Equinox Gets Analytics in Shape with Data Warehouse Modernization

Rohit: Hello everyone, and welcome to today's webinar. My name is Rohit, and I lead a team of solutions architects at AWS. Today we have our customer Anjan, ETL architect at Equinox, to share how Equinox is delivering insights and analytics with Informatica and Redshift. We also have Deepak from Informatica to talk about the partnership and the offerings they've built on top of AWS. As we go through this presentation, you'll have the opportunity to ask questions through the chat window, and we'll try to answer as many of them as possible in the hour we have. If we're not able to answer your question today, we'll follow up with everyone individually via email. Also note that this webinar is being recorded; a link to the slides will be made available to you within two to three days after the session, so please keep an eye out for that email. With that, let's begin.

What we've seen is that companies that mine their data are more valuable than ever. This is evidenced by an Aberdeen survey, which found that companies taking advantage of data by investing in data lake projects are growing revenue much faster than those that aren't. We all know that data is growing at an ever-increasing pace, and that there are more ways to analyze data, both in open source and in the cloud, than ever before. With this abundance of data we also see an abundance of tools, and an abundance of users trying to get value from the data. So while democratizing data across the organization has become a very common ask, we also need the right data governance frameworks in place to prevent mismanagement of that data.

The second trend we see is that traditional, legacy data warehouses become really slow and expensive as data sizes grow. They lack the flexibility to work with other analytics engines, and they don't work well with open formats and schemas, which leads to dissatisfaction among the new generation of analysts. It's also important to understand that the choice of data warehouse affects more than just the data warehousing project: the tools we invest in must work seamlessly across different data zones. For example, you might store highly structured data in your data warehouse, raw semi-structured data as JSON files in your data lake, and some highly structured data for exploratory query purposes sitting in the data lake as well. Traditional data warehouses treat these data zones as disconnected silos; they don't have the integrations in place to stitch those zones together into a cohesive picture. And on top of that, these traditional data warehouses are extremely expensive.

Amazon Redshift is the most popular data warehouse in the market today. We have over 10,000 customers processing over two exabytes of data per day with Amazon Redshift. Bear in mind that traditional data warehouses have been very slow to adopt the cloud patterns today's analyst population requires, and that's why we see customers adopting Amazon Redshift as their cloud data warehouse of choice on AWS.
Amazon Redshift is also designed with performance in mind. It is the fastest-growing cloud data warehouse in the market and up to two times faster than the next cloud data warehouse. We've also benchmarked against ourselves: it's 10x faster than it was two years ago. And with the new concurrency features we announced and delivered this year, it now consistently delivers fast performance at scale, even with virtually unlimited users and concurrency. If you look at the history of Amazon Redshift, you'll see we've delivered over 200 features and enhancements based on the lessons learned from the exabytes of data customers push through Redshift.

On top of that, unlike traditional data warehouses that are expensive and impose punitive licensing terms, Amazon Redshift is the most cost-effective data warehouse today. It's up to 25% cheaper than the next cloud data warehouse with on-demand pricing, and with reserved instance pricing you can get up to 75% cost savings. It's the only data warehouse available today with reserved instance pricing, giving you the best price-performance ratio for your workloads and dramatically improving your total cost of ownership.

As we noted before, traditional data warehouses typically don't take a cloud-native approach to working with the rest of the AWS ecosystem. Unlike them, Amazon Redshift integrates with your data lake: you can not only query and report on data loaded into Redshift tables, you can also query tables stored in open formats in S3, such as ORC or Parquet, through a capability called Redshift Spectrum. Traditional data warehouses, when run in the cloud, often run in silos; Redshift instead integrates with the rest of the AWS ecosystem, with integrations built into native services such as Athena, Kinesis for real-time streaming, Amazon EMR for processing needs, QuickSight, and SageMaker. Going forward, you'll see more and more services integrate directly with Redshift. We'll step through these use case patterns now.

The most common pattern we see is business intelligence, where you collect and aggregate data from various transactional sources, such as business applications and operational data stores, into S3. Once the data is in S3, Glue crawlers can scan it and populate your metadata catalog. With that catalog populated, Redshift gives you a couple of options: you can either load those tables into Redshift, or you can query the tables sitting in S3 directly from Redshift using Redshift Spectrum. In the latter case you don't have to load data into Redshift at all; Redshift reaches out into S3 to get insights from it. Customers then have multiple options for visualizing that data: they can use Amazon QuickSight, or rely on third-party tools from the AWS partner ecosystem that fit their visualization needs.
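To make the business-intelligence pattern concrete, here is a minimal sketch of the crawl-then-query flow in Python with boto3. Every name in it (bucket, IAM roles, crawler, catalog database, cluster) is a hypothetical placeholder, and the Redshift Data API is used simply as one convenient way to submit SQL; the pattern itself only needs a populated Glue catalog and an external schema in Redshift.

```python
import boto3

# Crawl the S3 data zone so table definitions land in the Glue Data Catalog.
glue = boto3.client("glue", region_name="us-east-1")
glue.create_crawler(
    Name="clickstream-crawler",
    Role="arn:aws:iam::123456789012:role/GlueCrawlerRole",
    DatabaseName="datalake",
    Targets={"S3Targets": [{"Path": "s3://example-bucket/clickstream/"}]},
)
glue.start_crawler(Name="clickstream-crawler")
# (Wait for the crawler to finish before querying the new tables.)

# Once the catalog is populated, Redshift can query the S3 tables in place
# through Redshift Spectrum via an external schema.
rsd = boto3.client("redshift-data", region_name="us-east-1")
for sql in [
    # One-time setup: map the Glue database into Redshift as an external schema.
    """CREATE EXTERNAL SCHEMA IF NOT EXISTS spectrum
       FROM DATA CATALOG DATABASE 'datalake'
       IAM_ROLE 'arn:aws:iam::123456789012:role/RedshiftSpectrumRole'""",
    # Query Parquet/ORC files where they sit in S3, without loading them.
    "SELECT page, COUNT(*) FROM spectrum.clickstream GROUP BY page LIMIT 10",
]:
    rsd.execute_statement(
        ClusterIdentifier="analytics-cluster",
        Database="dev",
        DbUser="awsuser",
        Sql=sql,
    )
```

The second statement queries the files in place; nothing is loaded into Redshift unless you later decide a COPY is worthwhile.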
The second use case is predictive analytics. Here we typically see customers use Amazon S3 to bring in raw data sets from multiple sources, use Amazon EMR and Hive to transform the unstructured data and combine it with columns already in Redshift, and use Apache Spark to process the data and create a combined view that's persisted in Redshift. From there, they can use tools like Informatica to cleanse, process, and enrich the data so it's ready for downstream advanced analytics, such as machine learning, anomaly detection, and other advanced capabilities they want to build. Once the data is available in a highly structured, refined, cleansed form, you can extract it and store it in Amazon S3 as your training data set. At that point you can use Amazon SageMaker for model building, training, and deployment, and get predictions out of that data.

The third use case is real-time analytics. Here we see customers take log streams, clickstreams, or events coming from sensors out in the wild, and use Amazon Kinesis Data Firehose to bring that data in, in a no-code fashion. Kinesis also lets you perform real-time analytics by expressing SQL queries or Java functions to get insights from the data or perform ETL on it, before eventually sending it to Redshift through the Kinesis Data Firehose destination for Redshift. All of this happens without a user or developer writing a ton of code; this is the power of how Redshift hooks into other AWS native services, where constructing this pipeline is largely a configuration exercise.
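As a rough illustration of how configuration-driven this is, the sketch below creates a Firehose delivery stream that stages events in S3 and issues a COPY into a Redshift table automatically. All ARNs, names, and credentials are made-up placeholders; in practice the same setup is often a few clicks in the console.

```python
import boto3

firehose = boto3.client("firehose", region_name="us-east-1")

# Firehose stages records in S3, then runs a COPY into Redshift for you,
# so the pipeline is configuration rather than code.
firehose.create_delivery_stream(
    DeliveryStreamName="clicks-to-redshift",
    DeliveryStreamType="DirectPut",
    RedshiftDestinationConfiguration={
        "RoleARN": "arn:aws:iam::123456789012:role/FirehoseRole",
        "ClusterJDBCURL": (
            "jdbc:redshift://analytics-cluster.abc123.us-east-1"
            ".redshift.amazonaws.com:5439/dev"
        ),
        "CopyCommand": {
            "DataTableName": "public.clickstream",
            "CopyOptions": "FORMAT AS JSON 'auto'",
        },
        "Username": "awsuser",
        "Password": "REPLACE_ME",
        "S3Configuration": {
            "RoleARN": "arn:aws:iam::123456789012:role/FirehoseRole",
            "BucketARN": "arn:aws:s3:::example-firehose-staging",
        },
    },
)

# Producers then push events with put_record; no ETL code required.
firehose.put_record(
    DeliveryStreamName="clicks-to-redshift",
    Record={"Data": b'{"user_id": 42, "page": "/classes/yoga"}\n'},
)
```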
There are a number of use cases customers are building with Redshift. In financial services, customers analyze trade and market data for risk analysis and fraud detection. In healthcare, customers use Redshift to analyze clinical records and improve patient outcomes. In advertising, customers run all sorts of clickstream analysis to improve ad targeting. In gaming, customers analyze data from games and players to understand in-game behavior. We're also seeing great uptake in travel, where customers use Redshift to create personalized experiences and offers for their customers. We would love to see what you come up with and build on Redshift. At this point I'd like to turn it over to Deepak from Informatica to talk about their offerings and the relationship they've built with AWS. Deepak, take it away.

Deepak: Thanks, Rohit, for your insights into data and analytics. My name is Deepak, and I'm a solutions architect with Informatica. For those who may not be familiar with us, we are leaders in cloud enterprise data management. Our mission is simply to accelerate your data-driven journey and your digital transformation.

Modernizing a data warehouse can be tricky and comes with its own set of complex challenges. Right off the bat, data is scattered across your on-prem systems, SaaS, and cloud, so customers hit the first hurdle: where do I find the data? How do I discover and understand the data I need to bring to AWS? How do I then get my data into AWS, and how do I ensure the data I'm bringing in is actually usable, meaning controlling its quality and making sure all the transformations are applied? Then, how do I identify my most sensitive data, like PII and PHI, and secure it? How do I monetize the data and get a 360-degree view of an entity, meaning a customer, a product, a supplier, and so forth? And how do I keep governing this data and keep track of it across my on-prem and AWS systems? We at Informatica can help our customers overcome all of these challenges.

Our platform is built on microservices. Starting at the bottom of the stack, you have unified connectivity, with 200-plus out-of-the-box connectors to different sources, an engine, and unified monitoring. On top of this we built our artificial intelligence layer, which we call CLAIRE; all our products use our AI technologies. To simplify things further, we built solutions such as enterprise data preparation and data governance, and we can ingest any kind of data in any format, as Rohit mentioned, whether it's streaming, batch, or micro-batch. For the data management capabilities I touched on in my earlier slide, we are recognized as an industry leader in every analyst quadrant, from iPaaS, our integration platform as a service, to data quality, master data management, metadata management, and data integration; we sit in the top-right quadrant for each of these capabilities.

For data warehouse modernization we see a few common paths. Customers may choose to migrate: many of you on this call may have a mandate to sunset your on-prem warehouses and move your workloads into AWS to use the efficiency and scalability AWS offers; this is the typical migrate, or lift-and-shift, use case. There's an extend path: you may have hit a capacity issue with your on-prem system, or you may have integrations that aren't the usual ones, so you keep your on-prem warehouse and extend newer use cases and newer integrations into AWS. And some fortunate few may take the third path, which is to start fresh directly on AWS. Regardless of which path you take, you'll still face the first challenge: where do I start, and where do I find and discover the data? For that we have Enterprise Data Catalog. It provides a semantic search across any entity in the enterprise, along with rich profiling and data quality statistics, and all of this is done automatically through our AI-powered discovery, classification, and business context.

Once you understand the data, the next step is getting it into AWS. For this we have Informatica Intelligent Cloud Services (IICS). It provides wizard-based, codeless development with plenty of productivity options, such as parameterization. It connects natively to the major AWS services and has 200-plus connectors for other SaaS and on-prem sources outside AWS. The platform itself comes with robust high availability, pushdown optimization, and thread-level isolation. Some stats around IICS: eight trillion transactions, seven million integrations a day, and over 300% growth in API integrations.

This next slide is a busy one; with more time I would go into the details. It shows our architecture and connectivity into Redshift, and how we optimize for Redshift's native compute and thread-level isolation so you get the performance gain. In Redshift, slices are the compute units. We understand those compute units, and we deliver the performance you need by using multi-part upload and by compressing and encrypting the data along the way.
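The load pattern Deepak describes (split the data, compress and encrypt the parts, let COPY fan the parts out across slices) can be sketched independently of any particular tool. The snippet below is a simplified stand-in, not Informatica's implementation; bucket, file, and role names are hypothetical.

```python
import gzip
import math
import boto3

s3 = boto3.client("s3")
BUCKET = "example-staging-bucket"  # hypothetical

def upload_compressed_parts(path: str, prefix: str, n_parts: int) -> None:
    """Split a file into gzipped, encrypted S3 objects on line boundaries.

    Redshift's COPY assigns roughly one object per slice, so matching the
    part count to the cluster's slice count keeps every slice busy. A real
    loader would stream instead of reading the whole file into memory.
    """
    with open(path, "rb") as f:
        lines = f.readlines()
    per_part = math.ceil(len(lines) / n_parts)
    for i in range(n_parts):
        body = b"".join(lines[i * per_part:(i + 1) * per_part])
        if not body:
            break
        s3.put_object(
            Bucket=BUCKET,
            Key=f"{prefix}/part-{i:04d}.csv.gz",
            Body=gzip.compress(body),
            ServerSideEncryption="AES256",  # encrypt the staged parts at rest
        )

upload_compressed_parts("checkins.csv", "staging/checkins", n_parts=4)

# The matching load, run on Redshift, picks up all parts in parallel:
#   COPY staging.checkins
#   FROM 's3://example-staging-bucket/staging/checkins/'
#   IAM_ROLE 'arn:aws:iam::123456789012:role/RedshiftLoadRole'
#   GZIP CSV;
```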
Another integration pattern we come across is mass ingestion, where large amounts of data need to be moved into AWS quickly. For this we have Mass Ingestion, whose capability is to move massive amounts of data very fast into Redshift and S3.

Now, a typical warehouse architecture, starting from the bottom of the slide: data exists in your on-prem systems, and more data comes in from cloud and other SaaS sources. Through Informatica you can migrate that data into Redshift using the optimized, multi-part, encrypted path. Once the data is in staging, we are cluster-aware. At this point we even support ELT, extract-load-transform, with full pushdown: all your logic can be pushed down to Redshift so you take advantage of the fast compute Redshift provides. Once you've transformed the data into the intermediate schemas, you can then master the data, apply further enriching transformations, and move it to the consumption area.
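To show what full-pushdown ELT means in practice, here is a hedged sketch: the transformation logic lives in SQL that executes entirely inside Redshift, with the client merely submitting statements. Schema, table, and cluster names are invented for illustration, and the Redshift Data API's batch call is used because it runs the statements in order as a single transaction.

```python
import boto3

rsd = boto3.client("redshift-data", region_name="us-east-1")

# Hypothetical layout: raw extracts land in `staging`, conformed marts
# live in `mart`. Both statements below run inside Redshift itself (ELT);
# no rows leave the cluster for transformation.
dim_sql = """
INSERT INTO mart.dim_member (member_id, full_name, home_club, tier)
SELECT m.member_id,
       m.first_name || ' ' || m.last_name,
       c.club_name,
       m.membership_tier
FROM   staging.members m
JOIN   staging.clubs   c ON c.club_id = m.home_club_id
WHERE  m.member_id NOT IN (SELECT member_id FROM mart.dim_member)
"""

fact_sql = """
INSERT INTO mart.fact_daily_checkins (member_id, club_id, visit_date, visits)
SELECT member_id, club_id, event_ts::date, COUNT(*)
FROM   staging.checkin_events
GROUP  BY member_id, club_id, event_ts::date
"""

# batch_execute_statement runs the statements in order as one transaction.
rsd.batch_execute_statement(
    ClusterIdentifier="analytics-cluster",
    Database="dev",
    DbUser="awsuser",
    Sqls=[dim_sql, fact_sql],
)
```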
As I mentioned, we support all the major AWS services: we can integrate with EMR as a compute engine, and we support Redshift, S3, Aurora, Kinesis, and DynamoDB. All of our products are certified to run on AWS, and we've automated our solutions as well as our product deployments in a quick, one-click fashion, so you can deploy our products very fast. To summarize: we at Informatica have a proven track record of 25 years, we are leaders in enterprise data management, and we have thousands of customers. We'll now hear from one of those customers, Equinox. It's my pleasure to introduce Anjan Gunday. Anjan?

Anjan: Thank you, Deepak, for the introduction, and thank you, everyone on the call. My name is Anjan, and I'm representing Equinox for today's webinar. I'm very excited to talk about Equinox's data journey and the success story of how we transformed our legacy data warehousing toolset into a modern, cloud-based data integration platform using cloud offerings from AWS and Informatica.

Before that, I'd like to share a little about Equinox, in case you're not familiar with us. Equinox is America's high-end luxury fitness company. It has around 15,000 employees, including world-class trainers who guide our members through the full cycle of their fitness and lifestyle goals. We have over a hundred Equinox clubs located in every major city in the United States, with a few international locations in the UK and Canada as well. There are 200-plus other locations with offerings beyond the Equinox flagship brand: Blink Fitness, a less expensive fitness alternative; Pure Yoga; SoulCycle; and Furthermore, a media brand under Equinox. The latest new offering is Equinox Hotels, where our guests and members experience a high-performance living environment and reimagine how they eat, sleep, move, work, and live. Our goal is to infuse the Equinox philosophy and facilities into these luxury hotels; this is a place where compromise simply doesn't exist. The first Equinox Hotel opens next month in New York City.

Unlike other fitness companies, where members check in, use a treadmill, and lift some weights, at Equinox we provide a unique experience based on each member's lifestyle, centered on movement, nutrition, and regeneration. We also showcase science-based research on cutting-edge topics focusing on mental engagement, breath work, movement quality, community, and respect for life, along with state-of-the-art equipment in each and every club. Our lines of business offer personal training, Pilates, spa, group fitness, retail, food and nutritional services, and more. That's why the Equinox motto is: it's not just fitness, it's life. If you're not an Equinox member yet, please do check out your closest Equinox club and experience the difference.

With this kind of luxury and sophisticated facility, we needed a robust data analytics system in place to analyze member experiences, assist business growth, and generate accurate reports for senior management and business users. For that we built our first-ever data warehouse, called Life, back in 2008; if you notice, the name matches our Equinox motto. The Life data warehouse was hosted on SQL Server, used the Informatica PowerCenter toolset for data integration, and used Business Objects for our analytical and reporting needs, with all the good stuff around it. It was a fairly traditional data warehouse: we followed the Kimball methodology and designed fully integrated star schemas. We built it solid, and it met all our analytical and reporting requirements for almost five years.

But sometimes Life wasn't good. With the latest innovations, digitization, and the introduction of new equipment in each of our fitness clubs, we needed to collect data from the connected fitness equipment and analyze it to create member experiences on the fly. We had to store this kind of data somewhere so our data scientists and analysts could look at it and analyze it, and the Life data warehouse couldn't handle these new requirements properly. Apart from that, we started seeing sizable bottlenecks in our processes. For example: the deployment process was not agile-friendly; we were unable to integrate the streaming data from our connected equipment, as I mentioned; we were unable to meet high-demand reporting needs and SLAs; infrastructure costs and commercial software licensing kept rising; and, most importantly, data kept growing due to the tremendous growth of the Equinox businesses. As a result, we had to begin searching for an alternative solution. This is a pretty common scenario, right? Every data warehouse eventually reaches its limits, and that's when you need to rethink and adjust accordingly to keep serving your business and ETL needs.

At the time, to overcome these challenges, we first thought about expanding and upgrading our on-premise infrastructure to bigger boxes and commercial software and databases. We bought those servers and started prototyping our Life data warehouse onto them, but we immediately started seeing overheads. For example, we needed platform-specific knowledge across the board; we needed to hire new DBAs who knew the new architecture to support our development team; the team had to undergo training on the new infrastructure and technologies to implement our strategies; we again noticed limited integration with our data sources; and, most importantly, it was very, very expensive. We spent almost six months of effort on that strategy, but no results were produced. We had to abandon that effort and go back to the drawing board. Our main goal at that point was to find economical, sophisticated, and friendly technologies to serve our modern data warehousing and analytical needs; basically, we were looking for a bright, futuristic, modern data analytics solution.
During that critical decision time, our vice president of data analytics suggested we go look into cloud technologies, as he is very knowledgeable in the AWS cloud and the technologies around it. We immediately started on the proposal, and that's when our modern data warehouse, named Jarvis, was born on the AWS cloud. You may notice the name: it's Iron Man's computer, Jarvis. As the name says, we designed and planned this project to be as smart and as effective as Iron Man's computer.

For the toolset, we adopted Amazon Redshift to host the Jarvis data warehouse. At exactly the same time, Informatica was offering Informatica Intelligent Cloud Services, and we adopted that as our data integration platform; it was a very good fit for Amazon Redshift. We used S3 for our raw data lake, utilized various other AWS offerings and tools along the way, and kept Business Objects and Tableau as our reporting and analytical tools. We also deployed many homegrown APIs to interact with this ecosystem. If you look at the equation at the bottom of the slide: Jarvis is this project built on AWS infrastructure, using all its offerings, with IICS as our data integration platform.

This is how our new data warehouse's high-level architecture looks. On the left side are our various sources; there are many more, but I've put the important ones here: our member engagement and Salesforce systems, the club-management database where most of our member information is collected, social media, and other sources. We integrated these data sources using Informatica Intelligent Cloud Services; pretty much 90% of all our data integration goes through IICS. We also use Amazon EMR clusters to stage some data, and at the bottom we extract data onto S3 for some analytical purposes. Once we've staged the data onto Amazon Redshift, our ELT processes kick off and load the various data marts within Redshift. The data is then fed into the analytical engine for reporting, and some data is consumed by internal APIs that send it to third-party marketing apps and various research and analytics consumers. At the bottom we also have our own homegrown data quality and monitoring toolset to assess data quality before we even present it.

In terms of design methodology, Jarvis still follows a star schema approach, but we designed it with somewhat flatter structures. As we all know, Redshift is columnar, so having wide tables is completely acceptable, and we get very good performance with it. Since we flattened the fact and dimension structures in this new design, we eliminated junk dimensions and bridge tables. This strategy helped us avoid complexity in query building around the data marts for our analytics teams; we created a very user-friendly analytical system, so our analysts don't need to build complex queries. We also conservatively designed and used Type 2 dimensions, of which we don't need many.

The major change we made going from Life to Jarvis was moving from an ETL-intensive to an ELT-intensive methodology. In other words, we designed very lightweight ETLs for the data integration part, with some data quality checks on top, to stage the data onto Redshift, and we built most of the business rules and logic into ELT scripts that run on the Amazon Redshift infrastructure to load the facts and dimensions very fast, in fact within minutes.
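As a hypothetical illustration of the flattened design Anjan describes, the DDL below inlines attributes that a snowflaked model might push into junk dimensions or bridge tables. Because Redshift is columnar and only reads the columns a query touches, the extra width costs little; all names and the column set are invented for the example.

```python
import boto3

rsd = boto3.client("redshift-data", region_name="us-east-1")

DDL = """
CREATE TABLE mart.dim_member_flat (
    member_key      BIGINT IDENTITY(1, 1),
    member_id       VARCHAR(32) NOT NULL,
    full_name       VARCHAR(128),
    home_club       VARCHAR(64),
    membership_tier VARCHAR(16),
    -- flags inlined instead of being split into a junk dimension
    has_pt_package  BOOLEAN,
    has_spa_access  BOOLEAN,
    is_hotel_guest  BOOLEAN,
    -- Type 2 history columns, used sparingly
    valid_from      DATE,
    valid_to        DATE
)
DISTSTYLE ALL      -- small dimension: replicate a copy to every node
SORTKEY (member_id)
"""

rsd.execute_statement(
    ClusterIdentifier="analytics-cluster",
    Database="dev",
    DbUser="awsuser",
    Sql=DDL,
)
```

Analysts can then join facts to one wide dimension instead of chaining bridge tables, which is exactly the query-simplicity gain described above.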
We pretty much took advantage of the toolset we had in hand to achieve optimal performance and maximum benefit, and that's how we balanced the workload between ETL and ELT. With this infrastructure we built thousands of IICS data synchronization tasks, data replication tasks, and mapping tasks to stage the data with fast performance, and created hundreds of ELT scripts that run on Amazon Redshift's MPP architecture to load the data model.

I just want to mention a few things about IICS here. Informatica, as Deepak already mentioned, is a leader in the data integration domain, and their cloud offering of IICS, arriving at the right time, helped us tremendously in our re-platforming effort. Due to its user-friendliness, we were productive with the tool from day one of the development effort. Another beauty of IICS is that it's truly one platform: depending on your access level, you can do any integration-related task in a single place. I can create a connection; develop and deploy any task, whether a synchronization task, a replication task, or a mapping; run task flows; and monitor, execute, and troubleshoot, all in one place, without switching between apps. Importing and exporting IICS assets between environments for deployment is also very easy. It has native AWS Redshift connectivity with strong data loading performance from any source into Redshift, and it has plenty of other connectors to support extracting data from virtually any source in the market, which is how we were able to integrate data from our full variety of sources without any issues. And we've noticed they keep releasing and implementing more and more features to make the environment even more user-friendly.

As far as infrastructure is concerned, here is some high-level information on our current cloud footprint. We have production and non-production environments: production runs on a dc2.8xlarge cluster with two nodes, while non-prod sits on a ds2.xlarge cluster with six nodes, hosting development, test, and QA efforts at the same time. All our Informatica services are also hosted in a secure virtual private cloud environment we created, and we have four IICS data integration environments as well: production, QA, test, and development.

After all this successful design, development, and implementation of our new modern data warehouse, Jarvis, we were able to fulfill all our analytical needs and reached our goal of successfully implementing this modern design. We designed a variety of processes around the Jarvis data warehouse. For example, daily batch processing runs once a day to process the previous day's data; intraday processes run on an hourly basis and generate some important reports; and we created near-real-time reporting with processes that run every ten minutes to generate detailed reports. We designed all these processes to meet our business requirements and reporting SLAs.

Using some native AWS offerings, we also developed bot integrations to manage our AWS resources optimally. As an example, one bot runs frequently to spot any cluster sitting out there unused; it warns and then shuts the cluster down, so we save resources, time, and money.
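The webinar doesn't show the bot's code, but the idea is simple enough to sketch: look for clusters with no recent database connections and act on them. The version below only pauses non-production clusters and treats every name as a placeholder; an actual bot would likely notify first and act on a second pass, and pause/resume is a newer Redshift capability than the one described here.

```python
import boto3
from datetime import datetime, timedelta, timezone

redshift = boto3.client("redshift", region_name="us-east-1")
cloudwatch = boto3.client("cloudwatch", region_name="us-east-1")

now = datetime.now(timezone.utc)
for cluster in redshift.describe_clusters()["Clusters"]:
    cid = cluster["ClusterIdentifier"]
    # Peak connection count over the last hour; zero suggests an idle cluster.
    stats = cloudwatch.get_metric_statistics(
        Namespace="AWS/Redshift",
        MetricName="DatabaseConnections",
        Dimensions=[{"Name": "ClusterIdentifier", "Value": cid}],
        StartTime=now - timedelta(hours=1),
        EndTime=now,
        Period=300,
        Statistics=["Maximum"],
    )
    peak = max((p["Maximum"] for p in stats["Datapoints"]), default=0)
    if peak == 0 and cid != "prod-cluster":  # never touch production
        print(f"WARN: {cid} looks idle; pausing it to save cost")
        redshift.pause_cluster(ClusterIdentifier=cid)
```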
Due to our selection of tools and their user-friendliness, and the smart team we have in place, we had everything required to adopt this new toolset. On a cost-saving basis, we are well in line with expectations: the Redshift and IICS solution saves considerably over the traditional on-premise data warehouse strategies we tried early on. Both the AWS and IICS cloud environments provide high availability on their services; in the last few years we've noticed minimal or no impact on any of those services during their updates, patching, and so on. We've hardly encountered any major issues, and when there is one, their wonderful support teams jump in and resolve it quickly. Our analytical environment is very critical for reporting, but if we ever need to restart the IICS Secure Agent services or the Redshift cluster services in some unavoidable situation, we can do it in a few minutes. What I'm saying is that we experience essentially no downtime at all with these services.

With the user-friendliness of the tools we chose, we could focus on building the functionality of the data warehouse rather than spending effort on toolset mechanics, which helped our team's productivity a great deal. With the wonderful support from AWS and IICS, and most importantly the best-of-the-best smart team we have at Equinox, we completed the entire re-platforming from Life to Jarvis within just one year. With this Redshift integration we were able to connect our Business Objects and Tableau analytical toolset to Amazon Redshift and generate accurate, timely reporting. Our end-user business community is very happy with the reports and the way we present the content to them, and we all know that when the business and the end users are happy, everything is happy, right? As far as our data journey is concerned, this is just the beginning: we always look for more opportunities and advancements, and we continuously evolve to stay in sync with the modern digital world of data and toolsets. For now, we are very happy, sailing smoothly along with our modern analytical system. That's our success story; I hope everyone liked it, and someone may relate to it. Once again, thank you all for listening. With that, I'd like to turn it over to Deepak.

Deepak: Thank you, Anjan, for sharing such an innovative solution; this was very informative. Now that you've heard from AWS and Equinox how they modernized their data warehouse, I want to leave you with steps to get started. What I have here is a QuickStart solution. It includes Informatica Enterprise Data Catalog to discover your data and Intelligent Cloud Services to migrate the data into Redshift, and it brings up Redshift, S3, and a visualization layer with Tableau. The solution comes with test data and a user guide, and we have trial licenses as well. The next asset is a self-serve assessment we are releasing at the end of this month: a free assessment of your on-prem data warehouse that gives recommendations for moving to AWS. More information about the assessment can be found at informatica.com/aws; that website also has a lot of other relevant information about Informatica on AWS. Thank you. On to you, Rohit.

Rohit: Thanks, Deepak, and thanks, Anjan, for sharing the great work you've done with Informatica and Redshift.
We're now going to transition into our live Q&A. You can submit written questions through the questions panel you see; if we aren't able to answer your question today, we'll follow up individually via email. So go ahead and start putting your questions into the chat window.

The first question I have here is for Informatica. Deepak, when should I leverage Glue, and when should I leverage Informatica? Can you provide your insight?

Deepak: Sure. We have customers using Glue and Informatica IICS in conjunction. Most of our customers have different use cases, and one solution does not fit everything, but I'll give you a general guideline. If it's a very simple case where the data already resides in AWS, meaning S3, and it just needs to be moved over to Redshift, customers generally look at using Glue. If it's a more complex integration that requires getting data from different source systems, and you don't want to write any custom-coded connectors, you'd generally use Informatica to migrate and ingest the data. Also, most of our customers are enterprise-grade: they have schedulers, jobs, and integrations with data quality and master data management. When your integrations require those kinds of capabilities, that's when you use Informatica. I hope that answers the question.

Rohit: Great, thank you for that. The next question is for Anjan. There are many data integration tools in the market; why did you choose Informatica IICS when redesigning your modern cloud-based enterprise data warehouse? Anjan, do you want to share your insight?

Anjan: Sure, that's a very good question. During our tool evaluation for the re-platforming, we looked into other data integration toolsets, but due to ease of use, and because we had been using the PowerCenter toolset for almost 20 years, it was an easy decision and an easy choice to go with the well-known vendor's toolset rather than experiment with others and waste our time. As we already discussed, Informatica is well known and a leader in the data integration domain, so going with the IICS offering was the natural choice for us. That's the choice we made, it's working great for us, and we are very happy with the offerings and services provided. Rohit, back to you.

Rohit: Thank you, Anjan. The next question is for Informatica: what is Informatica's position with respect to streaming frameworks like Flink or Kinesis? And as an Informatica developer, how can I extend my skill set to streaming frameworks? Deepak, would you like to comment?

Deepak: Yes. Informatica has a great story for streaming: Intelligent Cloud Services as well as our big data products support streaming, and a lot of our customers are using it. It's really an extension of the different use cases we support.

Rohit: Great, thanks, Deepak. A question for Anjan: do you still have an on-premises enterprise data warehouse?

Anjan: A little bit remained until, I think, last month; we slowly migrated each and every report and each and every piece of functionality into the cloud. As of last month we are one hundred percent a full-fledged cloud-based modern data warehouse on Jarvis, and the on-premise warehouse has been completely removed at this time.

Rohit: Great. Deepak, this one was for you: is IICS available on Informatica's cloud or on AWS infrastructure?

Deepak: The compute layer that supports IICS is driven by the Secure Agent. The Secure Agent can be hosted in AWS, and it can be hosted on-premise as well. We also have our own hosted agent, which can serve this purpose if you don't want to manage the agent yourself.
Rohit: I see. This may be a question for Anjan: in your architecture, why was EMR used with Informatica to ingest the data, when Informatica could process the data?

Anjan: We have different requirements, and we're always looking at different approaches to work with the data. For regular processing we consistently use Informatica. But for certain analysis that our data scientists need, as I showed in my diagram, we spin up EMR clusters and store the data on S3; that's the unstructured and other special kinds of data we keep on S3 to analyze in different processes. As I mentioned, 90% of our data integration goes through IICS, but for certain special requirements we use homegrown tooling on Amazon EMR to extract the data. It's a completely on-demand, requirement-driven methodology.

Rohit: I see. There are two follow-up questions for you, Anjan, so stay with us. What challenges did you face in transitioning from your on-premises enterprise data warehouse to AWS? And did you face any setbacks in the AWS cloud compared to hosting the database locally?

Anjan: The cloud has pretty much been more advantageous, as I described, compared to on-premise. We did not face any roadblocks or issues during the conversion process, actually none, because we figured out that instead of migrating our existing data warehouse, rebuilding the processes anew was more economical. Since we also changed our methodology from ETL-based to ELT-based, we developed our data integration tasks on IICS and our ELT scripts on Redshift completely from scratch, rather than migrating anything from the old system to the new, which could have caused problems a year or two down the road. Our approach was to start everything from scratch while keeping our fundamental business rules in place, and that approach worked very well: as I mentioned, the entire conversion happened within one year without any glitches. Performance-wise, we saw tremendous improvements because of the ELT methodology and because we took advantage of Redshift's MPP architecture; the loading, including the intraday processes, went very smoothly, with no glitches and no bottlenecks, and we now have a very high-performance, highly scalable data warehouse in place.

Rohit: That's great. Another question for you, Anjan: why did Equinox choose AWS and not GCP? What was your consideration?

Anjan: A similar answer: AWS is the leader in cloud technologies, it provides a well-secured environment, and no other solution comes close to AWS. As I mentioned during my presentation, our team is more equipped with AWS technologies than anything else, and our vice president of data analytics is very knowledgeable there; on a daily basis we pick up whatever the latest tooling is and apply each service where it fits. So to answer the question directly: AWS is the leader in cloud technologies, no other solution comes close, and it keeps us up to date.
Rohit: I see, great. A question for Deepak: does IICS replace PowerCenter?

Deepak: IICS addresses many sets of integration patterns, but we do not see IICS replacing PowerCenter. PowerCenter will still exist and continue to support integrations, and so will IICS. The main thing about our cloud integration services is that they are built on a brand-new microservices foundation, so as a user you get a unified experience whether you're using data integration, API integration, data quality, master data management, or the catalog: everything from one unified experience. Rohit?

Rohit: Great question, thank you. This one is likely for Anjan or Deepak, whichever it fits: can you give us some examples where this solution helped you in your daily operations? Anjan, maybe you want to take a shot at it; the question doesn't have a lot of context, but answer as best you can.

Anjan: Sure. Comparing our traditional approach to the cloud approach: because of the environment, the power of the tools, and having far fewer bottlenecks than on-premise, the processing windows were reduced tremendously to meet our reporting needs; it's pretty much a close-to-real-time reporting environment we've created with this toolset. And again, as I mentioned, we're maintaining a multi-terabyte data warehouse on this AWS Redshift cluster. If you think about it, that's not big data in the extreme sense, but we have quality data, and we process it within minimal time windows to meet every requirement, with no glitches while loading and staging the data, and no performance issues while processing the data marts on Redshift itself. So compared to the traditional setup, we see tremendous performance improvement in the way we designed it, and the services offered by AWS and IICS all help us avoid hiccups during our processes. I hope that helps answer the question.

Rohit: Thanks, Anjan. We've got time for one more question here; this is likely for Deepak. We're using Informatica to move data to AWS Aurora Postgres, but they only have an ODBC connector; does Informatica have a native connector to Redshift?

Deepak: Yes, we have native connectivity to Redshift. The architecture was explained earlier in the webinar; it is very high-performance, native connectivity to Redshift.

Rohit: I see, great. Well, thanks, Anjan and Deepak. On that note, we're going to wrap up today's webinar. As a reminder, you will receive an email within the next two to three days with a link to the slides and the recording of today's webinar. We want to thank you all very much for attending. If you have any other questions, please don't hesitate to reach out. Thank you very much, and have a wonderful day.
