
Dec 08

Survey Uncovers Financial Industry Challenges

Financial services organizations rely heavily on information found on public websites, social networks, and web portals to monitor markets, track the competition, identify suspicious fraud activity, maintain sanction lists, automate processes with B2B partners, and listen to what customers are saying. Access to these external sources of structured and unstructured information typically requires manual integration, which leads to tedious searching, copying, and pasting of data into spreadsheets, databases, or applications. Because this information is often time sensitive, reliance on manual processes undermines its value by the time it is gathered. Time is money.

These organizations also depend on an IT infrastructure to meet these needs, and their data integration strategy must address the growing need to access external data sources. Integrating internal systems with external sources is challenging, to say the least, especially when organizations are constantly adding new sources of information to their operations and those external websites and web portals either don't provide APIs or make development too time consuming and costly. Keep in mind: if IT is struggling today to keep up with the business's demands to add new sources of information and eliminate repetitive manual data handling, keeping pace tomorrow becomes an unattainable target for most financial services organizations.

A recent Computerworld.com survey of 100 financial services professionals highlights the challenges of acquiring and integrating data from multiple sources, including external websites and web portals. The survey revealed troubling challenges facing the financial industry. Here are a few highlights:

  • Struggles with integrating external sources: 43% of the participating financial institutions are struggling with a lack of integration between external data sources and internal systems. Of the external sources these organizations need to integrate with, 84% are web portals from which business information must be extracted and integrated with internal systems and processes.
  • Manual or hard-coded integration: 55% of respondents reported that integrating data between external sources and internal systems either involves users manually transferring data or relies on custom integration development, a hard-coded approach that does not scale to support many external data sources.
  • Manual data handling: Respondents identified the time required to manually import data and to perform validations as the two most costly consequences of being unable to integrate external data sources.
  • Deployment delays: Respondents want a solution that adapts quickly to varying data sources, yet integration projects often take months to complete. Only 8% of the financial services respondents said an external information integration project is completed in less than a month, and 31% reported it takes more than three months, illustrating the need for a faster, more efficient way to perform external data integration.

The bottom line: manual processes no longer fit into any financial organization's business processes. These time-consuming development projects for integrating external data sources into an enterprise infrastructure are clearly not a viable long-term strategy.

Financial organizations depend on data, whether it's used to transform industries, grow market share, defend brands, or protect customers. That takes an alternative approach to data integration, one that does not simply rely on traditional development tools and custom one-off projects. Data integration platforms that are easy to deploy and customize are the next step for external data integration.

Download the complete IDG survey here.

Nov 26

Recently, I participated in the annual CIO Logistics Forum with Henrik Olsen, Head of Business Architecture & Development at DSV, a global transportation and logistics supplier. The topic: Streamlining Logistics Operations and Automating B2B Processes.

During the presentation, Olsen observed increasing demand from customers (e.g., manufacturers of goods) for lower prices, improved service, and real-time data integration, while freight haulers are asking for higher rates to compensate for increased costs.

With price pressures coming from both customers and freight haulers, one of the few ways to improve profit is to increase operational efficiency through automation of B2B interactions and internal processes.

Typically, automating these processes becomes challenging as more and more customers move away from supporting Electronic Data Interchange (EDI). This is especially true for mid to small-size customers who cannot afford to keep up with the demand for EDI integration.

These smaller customers “dictate” their preferred way of integrating and exchanging information. This integration is typically driven through an email-based solution and/or through a web portal, often using Excel as the interchangeable format.

A typical scenario might go like this:

  • An email with an order is generated directly from the customer’s Enterprise Resource Planning (ERP) or Transportation Management System (TMS).
  • The transportation and logistics supplier receives the email with the shipping order request and processes the information.
  • The customer requires near real-time update of tracking information posted to their logistics web portal.
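To illustrate the supplier's side of this flow, here is a minimal sketch of parsing such an order email into structured fields. The message format and field names are invented for illustration; a real customer's email layout would differ.

```python
# Sketch: turning an emailed shipping order into structured data, assuming a
# plain-text body with "Field: value" lines (hypothetical format).
from email.parser import Parser

RAW = """\
From: orders@customer.example
Subject: Shipping Order 4711

Order-Id: 4711
Pickup: Hamburg
Delivery: Copenhagen
Weight-Kg: 120
"""

def parse_order(raw_message: str) -> dict:
    """Extract 'Field: value' lines from the message body into a dict."""
    msg = Parser().parsestr(raw_message)
    fields = {}
    for line in msg.get_payload().splitlines():
        if ":" in line:
            key, _, value = line.partition(":")
            fields[key.strip()] = value.strip()
    return fields

order = parse_order(RAW)
print(order["Order-Id"], order["Pickup"])  # 4711 Hamburg
```

Once the order is structured data, it can be fed straight into a TMS instead of being re-keyed by hand.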

Of course, this process is simple for customers since they don’t have to support EDI, and they will choose to only engage transportation and logistics suppliers who allow them to deliver the data in this flexible manner.

For the suppliers, however, this arrangement becomes more difficult and expensive, as the entire customer integration process is manual.

The good news is all these manual processes can be automated, and this is exactly what Henrik explained in his well-received presentation.

DSV plans to automate a considerable amount of these non-EDI B2B interactions and take advantage of the higher freight prices and better margins available from smaller customers. This is what I like to call the long-tail effect (see diagram), where technology like EDI is too expensive and complex to implement for low-volume customers, but alternative solutions are available to facilitate the automation and integration between business partners.

Many transportation and logistics companies all over the world are finding alternatives to EDI, where integration costs with smaller customers can be reduced by as much as a factor of 100 through complete automation of previously manual B2B processes. The result is a substantial increase in profits and improvements to the bottom line.

Many thanks to Henrik Olsen for presenting on this important topic. If you are curious to see how this works, I recommend you watch this short video.

Stefan Andreasen, Corporate Evangelist, Kapow at Kofax.
Nov 05

To say that pulling data from various internal and external sources is time-consuming is a masterpiece of understatement. Cutting and pasting, homegrown scripts, and applications that record a user's actions can't keep up with the pace of business, and over time the demand grows for both quantity and quality of information. Much information is accessible via public websites, but more data is often hidden behind firewalls and web portals that require login credentials and the ability to navigate the site in order to extract it. Valuable information is also embedded in PDFs, images, and graphics.

 

From start-ups to enterprise organizations, and across industries from financial services and transportation to retail and healthcare, acquiring external data is critical. Whether you want to stay in compliance, move ahead of the competition, or reach new markets, it all requires constant monitoring of web data. Data is extracted, transformed, and migrated into various reports and becomes the foundation on which business decisions are based.

So a web-scraping tool or a homegrown web-scraping approach seems like a good option, since it looks like a quick and inexpensive way to harvest the data you require. Or is it?

Now comes the uneasy feeling in the back of your mind. Can my homegrown approach, or a web-scraping tool, acquire the information I actually need? How do I know the data I received is accurate and formatted correctly? And if management wants different reporting data, how is that handled?

The short answer: You don’t know.

The right answer begins with an evaluation of your specific data requirements and business needs.

  1. How does web scraping acquire the data?

While product demonstrations can present an initial set of data in colorful dashboards full of charts and reports, you are better off asking for a technology demonstration that relates to your specific data collection needs. Write up a list of actual websites you gather data from. Your list should include various types of sites built with HTML5, Flash, JavaScript, and AJAX. Be sure to include websites behind firewalls and sites that serve PDFs. The more scalably, reliably, and quickly the web data extraction performs across these external websites, the better.
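One quick sanity check you can run during such an evaluation: test whether the values you need even appear in a site's raw HTML. If they don't, the site renders them with JavaScript or AJAX, and a simple static scraper will miss them. A minimal sketch, with both sample pages invented for illustration:

```python
# Sketch of a static-extraction check: if the value you need is absent from
# the raw HTML, the site likely renders it client-side and a plain scraper
# will need a JavaScript-capable engine instead.
from html.parser import HTMLParser

class CellCollector(HTMLParser):
    """Collect the text of every <td> cell in a page."""
    def __init__(self):
        super().__init__()
        self.in_cell = False
        self.cells = []
    def handle_starttag(self, tag, attrs):
        if tag == "td":
            self.in_cell = True
    def handle_endtag(self, tag):
        if tag == "td":
            self.in_cell = False
    def handle_data(self, data):
        if self.in_cell and data.strip():
            self.cells.append(data.strip())

STATIC_PAGE = "<table><tr><td>EUR/USD</td><td>1.0842</td></tr></table>"
AJAX_PAGE = '<div id="rates">Loading...</div>'  # value arrives via JavaScript

def static_cells(html: str) -> list:
    parser = CellCollector()
    parser.feed(html)
    return parser.cells

print(static_cells(STATIC_PAGE))  # ['EUR/USD', '1.0842']
print(static_cells(AJAX_PAGE))    # [] -> needs a JS-capable engine
```

An empty result on a page that visibly shows the data in a browser is a strong hint that the tool you evaluate must support dynamic rendering.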

  2. What does the data look like?

Suppose you have received some data using a web-scraping tool, but now you spend all your time trying to transform it, noticing formatting and quality issues along the way. If the extracted data is not accurately transformed into a usable format, such as Microsoft Excel, .csv files, or XML, it becomes unusable by applications with specific integration requirements, and you have lost half the value of your investment. Extracting and auto-correcting specialized data, including dates, currencies, calculations, and conditional expressions, plus removing duplicate records, are all important considerations.
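As a concrete illustration of the clean-up work described above, here is a minimal sketch that normalizes dates and currency strings and drops duplicate rows before export. The input formats are assumed for illustration:

```python
# Sketch: normalize scraped dates and currency amounts, then deduplicate,
# so downstream applications receive consistently formatted rows.
from datetime import datetime

def normalize_row(row: dict) -> dict:
    out = dict(row)
    # "12/31/2014" -> ISO "2014-12-31"
    out["date"] = datetime.strptime(row["date"], "%m/%d/%Y").date().isoformat()
    # "$1,234.50" -> 1234.5
    out["amount"] = float(row["amount"].replace("$", "").replace(",", ""))
    return out

def dedupe(rows: list) -> list:
    """Keep the first occurrence of each (date, amount) pair."""
    seen, unique = set(), []
    for row in rows:
        key = (row["date"], row["amount"])
        if key not in seen:
            seen.add(key)
            unique.append(row)
    return unique

raw = [
    {"date": "12/31/2014", "amount": "$1,234.50"},
    {"date": "12/31/2014", "amount": "$1,234.50"},  # duplicate record
    {"date": "01/15/2015", "amount": "$99.00"},
]
clean = dedupe([normalize_row(r) for r in raw])
print(len(clean))  # 2
```

The point is not this particular code but that any tool you evaluate should perform this class of transformation for you, rather than leaving it as manual follow-up work.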

  3. How difficult is it to make changes?

What happens if a website changes, or if you need to monitor and extract data from new websites? Many web-scraping tools have a high propensity to fail when websites change, which then requires resources, and in some cases a developer, to fix the problem. Unless you have a developer in house to make these fixes, this adds time and expense, and the problem only grows as you monitor and extract data from hundreds or even hundreds of thousands of websites. If scalability is important to you, be sure to ask how the technology monitors and handles changes to a website, especially if you plan to expand beyond your immediate data collection needs.
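A simple technique for catching such breakage early is to fingerprint a page's tag structure, so a layout change is detected before a broken extraction silently feeds bad data downstream. This sketch is a generic illustration, not any particular product's change-detection mechanism:

```python
# Sketch: hash the sequence of opening tags so a layout change can be
# distinguished from a routine data change.
import hashlib
from html.parser import HTMLParser

class TagOutline(HTMLParser):
    """Record the sequence of opening tags, ignoring text content."""
    def __init__(self):
        super().__init__()
        self.tags = []
    def handle_starttag(self, tag, attrs):
        self.tags.append(tag)

def structure_hash(html: str) -> str:
    parser = TagOutline()
    parser.feed(html)
    return hashlib.sha256(",".join(parser.tags).encode()).hexdigest()

v1 = "<div><table><tr><td>42</td></tr></table></div>"
v2 = "<div><table><tr><td>43</td></tr></table></div>"  # data changed only
v3 = "<div><ul><li>43</li></ul></div>"                 # layout changed

print(structure_hash(v1) == structure_hash(v2))  # True  (same layout, new data)
print(structure_hash(v1) == structure_hash(v3))  # False (extraction needs a fix)
```

Running such a check on every extraction turns silent failures into explicit alerts.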

Extracting and transforming web data is about more than just purchasing a web-scraping tool. Think about the data you are collecting and how it's tied to your business. In all likelihood there's a strong set of business drivers for collecting it, and taking shortcuts will only compromise your business goals. It should never leave you uneasy about the information you are collecting.

Look beyond the data that's being extracted and think about what you are doing with it in the context of serving your customers, creating a competitive advantage, or streamlining processes that rely on data from websites, portals, and online verification services.

Sep 17

Customer insights to best practices 

Last week I spoke with John, who leads a web automation team at a Fortune 500 professional staffing company. The company has been a Kapow customer for more than five years, primarily using Kapow for Customer Relationship Management (CRM) and Human Resources (HR) activities that involve transforming, synchronizing, and delivering information between their Vignette, SharePoint® and Salesforce® applications.

Like almost every enterprise organization, their business teams use Microsoft Excel extensively for data sharing and reporting, and collaborate via Microsoft SharePoint.

As John explains, “Microsoft Excel® is used throughout the organization to capture data within business teams, reporting, or simply for exchanging data, all within SharePoint.”

John elaborates, “Microsoft has really enhanced Excel, which is seen with the improvements in data visualization between Excel 2010 and 2013 versions. Microsoft is also integrating Excel with SharePoint 2013 so you can surface live Excel data directly in web parts in SharePoint. This is the path we are taking and with the new Excel edit feature in Kapow 9.4 we expect to quadruple the use of Kapow over the next year to support it.”


I must admit this is very exciting, and it is great to see the same enthusiasm from customers who recognize the value in automating activities that involve large amounts of data and heavy use of Excel.

When you use Kofax Kapow to dynamically update live internal and external data in an Excel spreadsheet, that information can in turn be surfaced in SharePoint, turning your entire SharePoint deployment into a collaborative, real-time decision-making platform. Data can be unlocked from any source you can think of, including cloud apps, enterprise apps, web portals, email, Active Directory, and of course SharePoint itself.

Today the company updates all of its Excel data repositories and reports manually, which is not only tedious, unexciting work but also inevitably introduces human errors that could become critical to the business.

Some of the data they capture comes from other departments, and just managing who has done what is a nightmare. That's why capabilities like Kapow's advanced logging are so important when it comes to having a full audit trail.

In one example, John explains how they currently receive a separate email whenever a person joins a training course. The email is sent through a SharePoint workflow, but a business user must then manually key the information into Excel. All of this is automated with Kapow 9.4.
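In generic terms, the automated version of that step looks something like the sketch below: the enrollment from the notification email is appended to a spreadsheet programmatically instead of being re-keyed. A CSV buffer stands in for the Excel workbook here, and the field names are invented; this is an illustration of the pattern, not Kapow's implementation.

```python
# Sketch: append training enrollments to a sheet without manual re-keying.
import csv
import io

def append_enrollment(sheet, name: str, course: str, date: str):
    """Add one enrollment row to the sheet (CSV stands in for Excel)."""
    csv.writer(sheet).writerow([name, course, date])

sheet = io.StringIO()
append_enrollment(sheet, "J. Smith", "Advanced Excel", "2014-09-10")
append_enrollment(sheet, "A. Jones", "SharePoint 101", "2014-09-12")

rows = list(csv.reader(io.StringIO(sheet.getvalue())))
print(rows[0])  # ['J. Smith', 'Advanced Excel', '2014-09-10']
```

The same pattern applies whether the destination is a .csv file, an .xlsx workbook, or a SharePoint list.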

John expects that automating manual Excel-driven work will expand their need for Kapow into more departments, including HR, payroll, and employee development. Finally, combining Excel and Kapow with SharePoint will drive adoption in data delivery, data visualization, and data collaboration.

Do you have any insights regarding Excel? We would like to hear from you.

Stay tuned for my next customer interview about using Kapow for Excel automation.

Stefan Andreasen

Corporate Evangelist, Kapow & Information Integration

Aug 26

Millions of organizations put up with the inefficiencies and risks of running critical parts of their business on spreadsheets, with the vast majority using Microsoft Excel® as their preferred tool. Spreadsheet software isn't designed for the way most companies use it today. Spreadsheets are handy for ad-hoc analysis, reporting, data exchange, prototyping, and other common tasks. In a corporate setting, however, the repetitive manual tasks needed to acquire and integrate information from internal and external data sources into spreadsheets can lead to costly errors. In addition, spreadsheets are difficult to audit and clumsy to work with in collaborative, repetitive business processes such as budgeting, sales and operational planning, partner data exchange, and cash management.

Ventana Research's comprehensive report, “Spreadsheets in Today's Enterprise – Making Intelligent Use of a Core Technology,” provides detailed insight into the use of Excel in the typical corporation. Excel is the de facto format for reports, data exchange, and financial models. According to a study performed by Ventana Research in 2012, 72% of participants said that their most important spreadsheets are ones shared with others.1

A typical Excel-based process involves opening a pre-formatted Excel template, complete with multiple worksheets, pre-built macros, tables, and graphics, and then editing and assembling data from a multitude of sources into this template to create the delivery document. Input data can come from systems such as email servers; business applications such as CRM, HR, or ERP; bank portals; business partner portals such as financial, supply-chain, and logistics partners; public government websites; and internal monitoring applications from departments such as IT, Marketing, and Procurement.

These Excel reports are then delivered to stakeholders in departments such as Finance, Sales, and IT, or externally to business partners, through email, FTP upload, or portal upload.

Rather than getting rid of spreadsheets, which for most companies would be nearly impossible, there is a modern way to cost-effectively automate the acquisition of the data they contain while preserving the familiarity and ease of use of Excel, with greater accuracy, easier collaboration, and the elimination of tedious manual processes.

FIGURE 1. Manual Excel-based process flow.

Innovative products such as Kofax Kapow let business users define the flow of their complete Excel process, with integration directly to all the information sources and destinations. Creating the solution does not take much longer than performing the work once manually, and it can then be repeated over and over again without human error. Kofax Kapow also delivers a full audit log of everything that happened and alerts selected people if anything goes wrong.
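In generic terms, the pattern is: run each step of the flow, log every outcome for the audit trail, and collect failures for alerting. The sketch below illustrates that pattern only; the step names and the simulated failure are invented, and this is not Kapow's implementation.

```python
# Sketch: an audited multi-step flow where every outcome is logged and
# failures are collected for follow-up alerts.
import io
import logging

# In-memory log standing in for a persistent audit log.
log_buffer = io.StringIO()
log = logging.getLogger("excel_flow")
log.setLevel(logging.INFO)
log.addHandler(logging.StreamHandler(log_buffer))

def fetch_portal_data():
    pass  # e.g. pull rows from a bank portal

def update_workbook():
    raise IOError("file locked")  # simulated failure

def publish_to_sharepoint():
    pass  # e.g. upload the refreshed workbook

def run_flow(steps):
    """Run named steps in order, log every outcome, collect failures."""
    failures = []
    for name, step in steps:
        try:
            step()
            log.info("step %s: ok", name)
        except Exception as exc:
            log.error("step %s: FAILED (%s)", name, exc)
            failures.append(name)  # in production: alert selected people
    return failures

failed = run_flow([
    ("fetch_portal_data", fetch_portal_data),
    ("update_workbook", update_workbook),
    ("publish_to_sharepoint", publish_to_sharepoint),
])
print(failed)  # ['update_workbook']
```

The audit log records every step, successful or not, which is exactly what manual spreadsheet processes cannot provide.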

The value is not only in the automation of the repetitive manual process, but also in increased business revenue from:

  1. Elimination of human errors.
  2. Near real-time results and delivery for quicker decisions or improved service levels.
  3. Running the process at speeds that would be impossible for a human.


FIGURE 2: Efficient workflow of automated Excel process with Kofax Kapow.

 

Next steps

When I discuss this topic with industry leaders, I typically recommend a number of steps to discover how Excel is used within an enterprise and to understand the potential for Excel automation. These steps include:

  1. Interview business managers in departments that use Excel.
  2. Estimate the amount of human time spent on manual, repetitive work.
  3. Estimate the business value of eliminating human errors.
  4. Estimate the business value of freeing employees to make better business decisions.
  5. Consider how your business could improve by including more data sources or acquiring data more frequently.

From these simple steps you can determine the ROI.
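The ROI arithmetic from those estimates can be sketched in a few lines. Every number below is a made-up example for illustration, not a benchmark:

```python
# Back-of-the-envelope ROI for Excel automation, following the steps above.
def excel_automation_roi(hours_per_week: float, hourly_cost: float,
                         error_cost_per_year: float,
                         platform_cost_per_year: float) -> float:
    """Annual savings divided by annual platform cost."""
    labor_savings = hours_per_week * 52 * hourly_cost
    total_savings = labor_savings + error_cost_per_year
    return total_savings / platform_cost_per_year

# Example: 20 hours/week of manual Excel work at $40/hour, $10,000/year in
# avoided error costs, against a hypothetical $25,000/year platform cost.
roi = excel_automation_roi(20, 40, 10_000, 25_000)
print(round(roi, 2))  # 2.06
```

A ratio above 1.0 means the automation pays for itself within the year; in this invented example the return is roughly double the cost.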

For most companies Excel Automation is a no-brainer.

A future blog post will go through real-life customer examples, so stay tuned.

Comments are welcome at stefan.andreasen@kofax.com

  1. Ventana Research, Spreadsheets in Today’s Enterprise, January 2013.

 

Apr 21

Many factors have contributed to SharePoint's longevity and success, and David Roe pointed out a few of them in his recent CMSWire article, “SharePoint: A Formidable Enterprise Collaboration Platform.” The article, which summarizes a Radicati Group report on SharePoint, mentions that SharePoint's ecosystem has been a key contributor to its continued success, and I agree completely. SharePoint functionality is also important, of course, and Microsoft has invested heavily to add social and mobile capabilities throughout SharePoint 2013. But business value doesn't come from a box: it comes from applying technology like SharePoint, and Kapow Enterprise, to the pressing needs that challenge your business. As part of the SharePoint ecosystem, Kapow improves many of our SharePoint customers' content processes, from capture to creation to enterprise search, just as we do for all the CMS products we support.

If you have any questions about content migration, give us a call and we can help decide whether we're right for you. You can get the full Kapow content migration story from our white paper on the topic. Attending SPTechCon April 22-25 in San Francisco? Bring your requirements by booth 220 in the exhibit hall.


Authored by: Carol Kimura, Director, Field Marketing at Kapow Software – a Kofax Company

Apr 08

On March 26 I presented at bpmNEXT 2014, an annual event for leaders in the business process management (BPM) industry, analysts, industry influencers, and vendors. There were nearly one hundred attendees from more than 10 countries. This was one of those events where you come back with great new contacts and a ton of inspiration.

Following welcoming remarks by Nathaniel Palmer and Bruce Silver, two of the biggest thought leaders in the industry and the team behind the creation and expansion of the BPM.com community, we jumped right into the 25 presentations, all of which delivered cutting-edge, innovative BPM demos.

The event was very well orchestrated and organized by Nathaniel and Bruce. At the end of the three-day conference, I can say it was definitely one of the best events I've ever attended. I was both honored and proud when my presentation, “Automation of Manual Process with Synthetic APIs,” was voted Best in Show by the attendees. Later this month, you'll be able to watch all of the bpmNEXT presentations at www.bpmnext.com.

So how do Synthetic APIs help most business processes?

BPM is all about using a workflow engine from one of many vendors to describe, manage, monitor, and improve the efficiency of business processes. This can be any process, but most companies invest in BPM around the critical, fundamental processes that drive major parts of their business.

Unfortunately, BPM does not help much in automating the individual sub-tasks of the processes it manages. This is especially true for the ever-increasing number of web-centric processes and processes involving web portals, because those portals more likely than not do not provide a full set of APIs reflecting the functionality of the portal itself. This is where Kapow Software's Synthetic API technology comes in.

Synthetic APIs, which encapsulate business rules, data transformations, and interactions with multiple applications and data sources, are easily built with the intuitive, live-data-driven workflow design environment of Kapow Enterprise, and can be deployed as REST services, SOAP services, or mini-apps (Kapow Kapplets™) at the click of a button. This makes it a breeze to automate all those tedious, repeatable sub-processes involving web portals, documents (like Excel), business applications (like ERP), and file systems (local or FTP). In fact, it's so easy with Kapow Enterprise that Kapow customers implement hundreds of automations per year, freeing important knowledge workers from repeatable manual data-driven work to focus on more relevant and gratifying work that substantially adds to top-line results.
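To make "deployed as REST" concrete in a generic way, here is a sketch of wrapping a stub automation in a plain Python HTTP endpoint. This is an illustration of the idea only, not Kapow's generated code; the robot logic and response fields are invented.

```python
# Sketch: exposing an automation (a "robot") as a tiny REST endpoint.
import json
from http.server import BaseHTTPRequestHandler, HTTPServer

def run_robot(portal: str) -> dict:
    """Stub for an automation that would log into a portal and extract data."""
    return {"portal": portal, "status": "ok", "rows": 3}

class SyntheticAPI(BaseHTTPRequestHandler):
    def do_GET(self):
        # GET /freight-portal -> run the robot for that portal, return JSON.
        body = json.dumps(run_robot(self.path.lstrip("/"))).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.end_headers()
        self.wfile.write(body)

# To serve: HTTPServer(("localhost", 8080), SyntheticAPI).serve_forever()
print(run_robot("freight-portal"))
```

Once an automation sits behind an HTTP interface like this, any BPM engine can invoke it as an ordinary service task.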

Many of the more than 250 Kapow customers experience such a huge business benefit and competitive advantage with the Kapow Enterprise platform that they ask us not to mention their names under any circumstances. For more details on Synthetic APIs, check out the Synthetic API on-demand webinar. Comments are also welcome at sandreasen@kapowsoftware.com.


Authored by: Stefan Andreasen, Corporate Evangelist, Data Integration, Kapow Software – a Kofax Company

Apr 07

Just as Oracle users began to gather for COLLABORATE 14, the IOUG-run conference, I was speaking with one of our long-time Oracle customers and the topic turned to her journey with Kapow.

She began, as many of our customers have, in the middle of a major content migration project of several hundred thousand pages that had begun to slip almost from the very first day. After meeting with Kapow at COLLABORATE she saw the potential, and about a week after bringing Kapow Enterprise in-house she was convinced.

What struck me was just how difficult it was for her—a seasoned content management expert—to believe that it was worth automating a content migration. “In the past I’d used specialists to develop migration scripts but we didn’t find very much reuse,” she told me, and went on to say “Despite what experts advocate, I’ve always had to transform content as part of my projects and scripting isn’t well-suited to that. So I took automation out of my migration toolbox until I found you at COLLABORATE a few years back.”

Since then they have placed Kapow Enterprise at the center of their Information Management function, using it to create content from databases, to integrate support documents from outside their firewall, and even to load richer metadata into the index of their search solution. So whether you need to publish data from an Oracle database or migrate content into WebCenter, give us a call and we can help assess whether we’re right for you. You can get the full Kapow Content Migration story from our white paper on the topic.

And if you're attending COLLABORATE 14 in Las Vegas, come hear Stephen Moore speak on “Automating Web and Document Migrations,” Wednesday, April 9, from 2-3 p.m. He's on Level 3, San Polo 3501A (Session 908). And don't forget to stop by booth 1643 at the Exhibitor's Showcase for a one-on-one discussion of your requirements.


Authored by: Carol Kimura, Director, Field Marketing at Kapow Software – a Kofax Company

Mar 24

Kapow is once again pleased to be part of the Adobe Summit in Salt Lake City on March 24-28; it’s not our first time here and I expect it won’t be our last. Adobe Experience Manager, built on Adobe CQ, is a powerful tool for web marketers and our success in the Adobe community is a reflection of that.

However, I find too often that high-visibility content projects will struggle, or even become high-visibility failures, no matter what the target CMS is.

At this very moment, two of my colleagues at large organizations are in the middle of large content migrations, and both are trying to work through the content freeze that’s considered a “best practice” in content migration. One hasn’t been able to update customer-facing documents for months, leading to mis-quoted sales and even a few customer defections when inconsistent price lists were circulated by email during the freeze period.

But all of that is unnecessary. Thanks to today's technology, lengthy content freezes aren't needed. We have a white paper that explains why, or, even better, if you're attending the Adobe Digital Marketing Summit, stop by our booth #811 and say hello.


Authored by: Carol Kimura,  Director, Field Marketing at Kapow Software – a Kofax Company

 

Nov 11

I'm proud to announce that Kapow Enterprise 9.3 comes with an integrated WebKit-based browser. This means that when designing or running data integration flows (aka robots) in Kapow Katalyst™ against web-based systems or applications, you can take advantage of the impressive HTML5 compatibility and JavaScript performance of WebKit.

For those of you who are not familiar with WebKit, it is the common core of Safari and Chrome (up until the most recent versions of Chrome, which now run on a WebKit fork known as “Blink”). According to StatCounter, the web traffic analysis tool, WebKit is the most widespread browser engine in use on the internet, ahead of both the IE and Firefox engines.


This means that integration flows based on WebKit have a very high likelihood of being compatible with the websites in use around the world. For older legacy systems that you wouldn't expect to work in Chrome or Safari, we recommend that you continue to use our classic, IE-compatible browser engine. Having both engines in the product gives you maximum flexibility for integrating with cutting-edge web applications as well as legacy applications that still hold important information and functionality but are no longer updated to support modern browsers.

Making browsers that were created for human interaction controllable by an integration flow isn't the easiest thing in the world. It often requires a lot of scripting and trial and error, and the result can be hard for others to read and maintain.

But we've taken our knowledge of how browsers work, including algorithms to determine when the time is right to take the next step (clicking a link or entering data into a form), and wrapped the WebKit engine in this logic, making it easy for you to build integration flows using point-and-click development. These flows are quick to create and easy to maintain over time, providing stable Synthetic APIs so data can be rapidly integrated from applications and data sources that do not have APIs.
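One common heuristic behind such "is the page ready?" logic is to poll the rendered content and take the next step only after it stops changing for a quiet period. The sketch below illustrates that idea in isolation; the simulated page snapshots are invented, and this is not Kapow's actual algorithm:

```python
# Sketch: decide when a dynamically loading page has settled by waiting for
# a number of consecutive identical content snapshots.
def wait_until_stable(snapshots, quiet_polls: int = 2) -> int:
    """Return the poll index at which content stopped changing."""
    last, stable = None, 0
    for i, snap in enumerate(snapshots):
        if snap == last:
            stable += 1
            if stable >= quiet_polls:
                return i  # content unchanged for quiet_polls polls
        else:
            last, stable = snap, 0
    raise TimeoutError("page never settled")

# Simulated polls of a page as AJAX content streams in:
polls = [
    "<div>",
    "<div><table>",
    "<div><table>3 rows",
    "<div><table>3 rows",
    "<div><table>3 rows",
]
print(wait_until_stable(polls))  # 4 -> safe to take the next step
```

In a real flow, each snapshot would come from re-reading the live DOM at a fixed polling interval.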

Authored by: Anne-Sofie Nielsen, Vice President of R&D, Kapow Software – A Kofax Company
