Alteryx Sample Resumes

Accelerate your job hunt with our Alteryx sample resumes. The industry’s best downloadable Alteryx sample resume templates, approved by our experts, are available for all levels – beginner, intermediate and advanced professionals. IT professionals and IT beginners can use these formats to prepare their resumes and start applying for IT jobs.

Advanced Alteryx Developer Resume Samples – Here Coding Compiler is sharing very useful advanced Alteryx resume samples for Alteryx professionals. These sample resumes come from real-time Alteryx professionals and contain real-time projects. You can use these Alteryx developer resumes as a reference, build your own resume, and get shortlisted for your next Alteryx job interview. All the best for your future and happy learning.

Alteryx Sample Resume 1


E-mail: ***********

Phone: +91-xxxxxxxxxxx


Experience Summary

  • 3 years of IT industry experience in the design, development and implementation of data warehousing technology and data analysis. 
  • Tech Mahindra (May 2012 – Dec 2014) 
  • ZS Associates (Dec 2014 – Till Date) 
  • 3 years of experience in Insurance and Pharma Domain. 
  • Expertise in Data Warehousing projects using Datastage and SQL. 
  • Expertise in using Alteryx Data Blending tool. 
  • Excellent knowledge in SQL and writing complex SQL Queries. 
  • Excellent knowledge and hands on experience in Dimensional Modeling and Data Warehousing Concepts. 
  • Extensive experience with requirement analysis, design, development, customization and implementation of data warehousing ETL applications. 
  • Proficient in analyzing and translating business requirements to technical requirements and architecture. 
  • Good communication and presentation skills and ability to quickly adapt to the technical environment as a developer and a good team player. 
  • Hands-on experience in job tuning: identifying and resolving performance issues in parallel jobs in Datastage. 
  • Excellent team player with strong written communication and analytical skills. Capable of working in a high-stress environment with resource constraints. 
  • Expertise in IBM WebSphere Datastage 8.1/8.5. 
  • Worked on IBM IIW model and legacy data. 
  • Extensively worked on all phases of data warehousing (staging, work area, dimension and fact loading, etc.). 

Professional Experience:

  • Worked as BI developer with [company name], Hyderabad from May 2012 to Dec 2014. 
  • Currently working as Technology Analyst in [company name]. 
Technical Skills:
Tools :       Datastage (8.1/8.5v), Alteryx
Databases :       DB2, Oracle, MS SQL Server 2005
Reporting tools :       SAP BO and MDX Cubes


Education:

  • Completed B.Tech at SRM University with an aggregate CGPA of 6.5 (2007–2011). 
  • Completed Intermediate at Sri Chaitanya, Hyderabad with an aggregate of 78% (2005–2007). 
  • Completed secondary education at a CBSE school in Delhi with an aggregate of 84% in 2005. 
Client: Bank Of Nova Scotia
Project: GEM_AMLC (Anti Money Laundering Correspondent)
Environment: Data Stage 8.0v, SQL Server 2005
Role: ETL Developer (Datastage).
Duration: Jan 2012-Aug 2012


The Automated Monitoring and Surveillance Global Electronic Fund Transfer Monitoring – Anti Money Laundering Compliance (AMS GEM AMLC) project provides the AMLU with the ability to monitor, identify and investigate selected EFT/remittance products for money laundering and terrorist financing activity. This project deals with SWIFT (Society for Worldwide Interbank Financial Telecommunication) and non-SWIFT bank transactions.

These transactions are processed in the INBOUND process, where each transaction is checked, each party in the transaction is identified, and a party identifier is created.

In the HUB process, the newly created parties and the existing parties are loaded into tables and compared against the list of parties in the suspect groups, which the customer provides on a monthly basis. In the OUTBOUND process, the identified parties and their associated suspect group IDs are extracted into an output file. The extracted file is monitored manually every day by another team (EXAC) to check the list of parties involved in suspect groups.
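The HUB-process matching step described above boils down to joining the known parties against the customer-supplied suspect list. A minimal sketch in Python with SQLite is shown below; all table names, columns and sample rows are invented for illustration and do not reflect the project's actual schema:

```python
import sqlite3

# Hypothetical tables: parties identified in the INBOUND process, and the
# monthly suspect-group list supplied by the customer.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE parties (party_id TEXT, name TEXT)")
cur.execute("CREATE TABLE suspect_groups (party_id TEXT, group_id TEXT)")
cur.executemany("INSERT INTO parties VALUES (?, ?)",
                [("P001", "Acme Trading"), ("P002", "Globex Ltd")])
cur.executemany("INSERT INTO suspect_groups VALUES (?, ?)",
                [("P002", "SG-17")])

# The OUTBOUND extract: parties that matched a suspect group, with the
# associated suspect group ID.
cur.execute("""
    SELECT p.party_id, p.name, s.group_id
    FROM parties p
    JOIN suspect_groups s ON p.party_id = s.party_id
""")
matches = cur.fetchall()
print(matches)  # [('P002', 'Globex Ltd', 'SG-17')]
```

In the real project this comparison ran inside Datastage jobs rather than SQLite; the join is only meant to illustrate the matching logic.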

The most important sources for this project are:

  • Wire transactions in a data-format file, which is parsed using a Perl parser; each transaction is loaded into its respective columns in text format. 
  • Customer Country Reference tables from DB2 
  • BANK reference tables from SQL server. 

Roles & Responsibilities:

  • Participated in the development of several modules in the project. 
  • Delivered new and complex high-quality solutions to clients in response to varying business requirements. 
  • Involved in creating the job master sequence over multiple jobs. 
  • Involved in preparing test cases for unit testing. 
  • Applied performance tuning techniques to slow-running jobs. 
  • Involved in peer reviews. 
  • Coordinated with the onshore team to meet business requirements. 


Project : Accident Year Data Mart

Environment : Data Stage 8.5v, DB2

Role : ETL Developer (Datastage).

Duration : xxxxx – Present


The project involves building a data mart for various insurance policy types at the coverage level. The data sources are legacy systems known as CPS (Claims Processing Systems) and PPS (Policy Processing Systems), plus a reporting database built using the IBM IIW framework. The sole objective of the project is to support and provide analytical data to the actuarial team for various purposes. The major challenge in this project was to understand and analyse the client’s requirements and design the data mart accordingly.

Roles & Responsibilities:

  • Individually developed and analysed a good number of modules in the project. 
  • Involved in developing UNIX trigger scripts using shell scripting to trigger Datastage jobs. 
  • Understood the client’s functional requirements and analysed them from a data model perspective. 
  • Delivered new and complex high-quality solutions to clients in response to varying business requirements. 
  • Designed and developed Datastage jobs to extract, transform and load data into the data warehouse. 
  • Involved in preparing test cases for unit testing. 
  • Applied performance tuning techniques to slow-running jobs. 
  • Involved in peer reviews. 
  • Coordinated with the onshore team to meet business requirements.

Alteryx Sample Resume 2


Email: *********

Phone: +91 xxxxxxxxxxx


  • 5+ years of work experience, including experience in the installation, development and implementation of Hadoop
  • Experience in dealing with Apache Hadoop components like MapReduce, HDFS, Hive, Pig, Sqoop, Big Data, Pivotal HD, HAWQ, Alteryx.
  • Create workflows to fetch data from different sources to HDFS using Alteryx and Schedule jobs.
  • Worked with Linux commands to examine the data in HDFS.
  • Hands on experience in MapReduce jobs. Experience in installing, configuring and administrating the Hadoop Cluster of Major Hadoop Distributions
  • Experience in design, development of web based applications using HTML, DHTML, CSS, JavaScript, JSP and Servlets
  • Experience in Java Application Development, Client/Server Applications, Internet/Intranet based database applications and developing, testing and implementing application environment using J2EE, JDBC, JSP, Servlets, Oracle, PL/SQL and Relational Databases
  • Experience in working with different operating systems Windows 98/NT/2000/XP, LINUX
  • Good expertise using the various Development tools like Eclipse and Net Beans
  • Good at Relational Database concepts and extensively worked with ORACLE. Very good in writing complex SQL queries and PL/SQL procedures
  • Possess excellent communication, interpersonal and analytical skills along with positive attitude


Big Data :       Hadoop, HDFS, Hive, Sqoop, Pivotal HD, HAWQ, Alteryx

Languages :       Java

Design and Modeling :       UML

Databases :       Oracle, MS SQL Server, MySQL, SQL/PLSQL

Markup/Scripts :       JavaScript, HTML, XML

Version Control :       CVS, SVN

Environments :       Red Hat Linux, Windows Server 2008


  • Currently working with [company name] since 28-Oct-2014.
  • Worked with [company name] from 11-Jul-2012 to 24-Oct-2014.
  • Worked with [company name] from 30-Nov-2009 to 6-Jul-2012.



Duration : Nov 2014 to Till Date

Role : Sr. Software Analyst

Team Size : 4

Platform : RHEL 6.5

Environment : Pivotal HD, HDFS, HAWQ, Alteryx.

The client is one of the top telecommunication companies in Australia. The objective of this project is to enable business users to analyze the data and build customized reports to predict future business needs. Alteryx runs around 65 jobs to fetch data from different sources and load it into HDFS.

Alteryx runs four types of jobs: initial, stage, history and active. The initial job is a one-time job that fetches data from the source to the destination. The stage job fetches data from the source and keeps a copy of it in HDFS. The history job compares the previous day’s data with the current day’s data and stores the deleted and updated records in a history file. The daily jobs fetch the active data along with a copy of the current day’s data, which is used for the next day’s comparison. External tables are created with HAWQ to expose the data in tabular format.
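The history job's compare step amounts to a snapshot diff between two days of data. A rough sketch in Python follows; the record keys and values are invented stand-ins for whatever fields the real workflows carried:

```python
# Minimal sketch of the history job: compare the previous day's snapshot
# with the current day's data, collecting deleted and updated records.
# Record IDs and values here are illustrative only.
def history_delta(previous, current):
    """Return (deleted, updated) records, keyed by record ID."""
    deleted = {k: v for k, v in previous.items() if k not in current}
    updated = {k: current[k] for k in previous
               if k in current and previous[k] != current[k]}
    return deleted, updated

yesterday = {"r1": "active", "r2": "pending", "r3": "closed"}
today     = {"r1": "active", "r2": "closed"}  # r3 was deleted, r2 was updated

deleted, updated = history_delta(yesterday, today)
print(deleted)   # {'r3': 'closed'}
print(updated)   # {'r2': 'closed'}
```

In production this diff was performed by Alteryx workflows over HDFS files rather than in-memory dictionaries; the function above only illustrates the deleted/updated classification.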

Roles & Responsibilities:

  • Created workflows to fetch data from different sources into HDFS using Alteryx, and scheduled jobs.
  • Created workflows to compare the previous day’s data with the current day’s data in order to maintain history data.
  • Worked with Linux commands and HAWQ to monitor the data and external tables.
  • Created schemas, external tables and views to expose the data in tabular format.
  • Created roles and granted privileges on specified schemas to those roles.

Duration : Oct 2013 to Sep 2014

Role : Software Developer

Team Size : 7

Platform : RHEL 6

Environment : Hadoop 1.0.4, MapReduce, HDFS, Hive, Java (jdk1.6)

Nissan is one of the world’s largest automobile companies. At Nissan, there are around 160 mainframe-based applications. The objective of this project is to perform log file analysis for these mainframe applications. The huge volume of log file data, which is in a proprietary mainframe format, contains various metrics such as CPU usage, memory usage and disk usage. We wrote a set of MapReduce jobs to parse the log files and convert the data to CSV format. We loaded the data into HDFS and then into Hive tables, where we wrote Hive join queries to fetch information from multiple tables and perform various analytics.
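The parse-to-CSV step can be illustrated with a toy example. Since the real mainframe log format is proprietary, the pipe-delimited layout below is invented purely to show the kind of field extraction the MapReduce jobs performed at scale:

```python
import csv
import io

# Invented stand-in for the proprietary mainframe log format: one record
# per line, with CPU, memory and disk metrics embedded as key=value pairs.
raw_lines = [
    "APP01|20140105|CPU=42|MEM=71|DISK=55",
    "APP02|20140105|CPU=13|MEM=40|DISK=62",
]

def parse(line):
    """Split one log record into CSV-ready fields."""
    app, date, *metrics = line.split("|")
    values = dict(m.split("=") for m in metrics)
    return [app, date, values["CPU"], values["MEM"], values["DISK"]]

buf = io.StringIO()
writer = csv.writer(buf)
writer.writerow(["app", "date", "cpu_pct", "mem_pct", "disk_pct"])
for line in raw_lines:
    writer.writerow(parse(line))

print(buf.getvalue())
```

The resulting CSV is the shape of data that was then loaded into HDFS and queried from Hive tables.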

Roles & Responsibilities:

  • Wrote multiple Java-based MapReduce jobs for data cleaning and preprocessing.
  • Loaded and transformed large sets of structured, semi-structured and unstructured data.
  • Wrote Hive queries in HQL to load data into Hive tables.
  • Implemented Hive joins to merge data from different Hive tables.
  • Involved in loading data from the UNIX file system to HDFS.
  • Responsible for managing data coming from different sources and applications.
  • Involved in creating Hive tables, loading them with data and writing Hive queries that run internally as MapReduce jobs.

Customer Support Management System

Duration : Feb 2013 to Sep 2013

Role : Software Developer

Team Size : 6

Platform : Windows

Environment : J2EE, JSP, HTML, XML, Java Script, JDBC, Servlets

Database : SQL Server


Developed an online web-based application for World Brands & Trends, a single-point distributor of several international brands. Retailers and users can log in and order products such as fashion, perfumes, cosmetics, health care and food items. Through this web-based application, customers can shop, buy and have goods delivered to their homes free of charge. The application makes it easy to log in and order goods, and the team can interact with customers who request information.


Roles & Responsibilities:

  • Created and coded web pages. 
  • Built user module interfaces using JavaBeans.
  • Built the product module using JSP and Java.
  • Low-level design, coding and unit testing of the modules.
  • Web page development and client-side validation using JavaScript.

Pharma Plus-Retail System

Duration : July 2012 to Feb 2013

Role : Software Trainee

Team Size : 6

Platform : Windows

Environment : J2EE, XML, HTML, JavaScript.

Database : Oracle


This Pharma Plus Retail System (PPRS) software is developed for retail pharmacy shops. It has Accounts, Application, Transaction and Report modules, and it automates retail pharmacy shop operations. In the Application module, the retailer can maintain all master details such as store, manufacturer, supplier, tax type, product, customer, doctor, etc. The admin can view, modify, update and delete the details. The Report module lists the alert, purchase and sales reports. The Transaction module allows the purchase, sale and return of items. The Account module contains the payment details and the bank account details.

Invoice Processing

Duration : Dec 2009 to July 2012

Role : Team Member

Team Size : 13

Platform : Windows

Environment : AS 400, MS Excel, iPortal

Process 1:

Process Description:

BFS (Bankruptcy Foreclosure Services) payoff quotes deals with foreclosure and bankruptcy properties and provides the outstanding amount, along with expenses incurred on foreclosures, as payoff quotes. The process involves federal laws and frequent interactions with attorney firms in the US.


A payoff quote is a quote released to the borrower; by paying it, he or she is relieved of any further obligation to Bank of America. The payoff quote includes all charges assessed on the account during the foreclosure process. The payoff quote is valid for a limited period, beyond which it may change due to additional charges being assessed on the account. The payoff quotes team in India is responsible for following up with the attorney regarding the fees and costs assessed on the account so that they can be included in the payoff quote. The team also works with fee-sensitivity experts to ensure that only allowable fees are quoted on the payoff quote before it is released to the customer. 


  • Pulled data from applications and assigned it to all teammates.
  • Worked on rush requests sent by the LOB via email.
  • Collated the hourly count of worked loans throughout the day and sent it to management.
  • Performed cycle time studies on latency and process-related tasks.

Process 2:

Process Description:

Bank of America maintains properties that are in foreclosure; maintenance can be done by third-party vendors. The vendors repair and maintain the properties based on the requirements and upload images of the work performed at the properties, along with the invoice. Bank of America pays the third-party vendor based on the work performed at the property and the generated invoice.


Invoice processing is the process of checking invoices generated by the third-party vendors who maintain the foreclosed properties. In this process, photos and invoices are uploaded into an application called iPortal. The photos contain the before, during and after images of the work performed by the vendors. We need to check whether the work was performed properly by looking at the images available in the application. If all the images show that the work has been completed, we then check the price list. If the invoice sent by the vendor matches the charge sheet, we pay the vendor.


  • Checked the work performed at the property by looking at the images uploaded into the application.
  • Checked for the before, during and after photos that support the work done by the vendor.
  • Found the latest invoice related to the work performed by the vendor.
  • Matched the invoice charges with the price list to verify whether the invoice was correct.
  • If correct, paid the vendor the amount charged in the invoice; otherwise, paid zero.
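The pay-or-zero decision rule above can be sketched as a tiny function. The function name, parameters and amounts below are hypothetical; they only encode the rule as described:

```python
# Hypothetical sketch of the invoice check: pay the vendor the invoiced
# amount only when the work is verified from the photos AND the charge
# matches the agreed price list; otherwise pay zero.
def payable_amount(invoiced, price_list_amount, work_verified):
    """Return the amount to pay the vendor under the pay-or-zero rule."""
    if work_verified and invoiced == price_list_amount:
        return invoiced
    return 0.0

print(payable_amount(250.0, 250.0, work_verified=True))   # 250.0
print(payable_amount(300.0, 250.0, work_verified=True))   # 0.0
print(payable_amount(250.0, 250.0, work_verified=False))  # 0.0
```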

Related Sample Resumes:

  1. AWS Sample Resume
  2. Ab Initio Sample Resume
  3. AEM Sample Resume
  4. Advanced SAS Resume


