AI EXPRESS - Hot Deal 4 VCs instabooks.co
Bring legacy machine learning code into Amazon SageMaker using AWS Step Functions

March 15, 2023 · Machine Learning
Tens of thousands of AWS customers use AWS machine learning (ML) services to accelerate their ML development with fully managed infrastructure and tools. Customers who have been developing ML models on premises, such as on a local desktop, want to migrate their legacy ML models to the AWS Cloud to take full advantage of the most comprehensive set of ML services, infrastructure, and implementation resources available on AWS.

The term legacy code refers to code that was developed to be run manually on a local desktop, and is not built with cloud-ready SDKs such as the AWS SDK for Python (Boto3) or the Amazon SageMaker Python SDK. In other words, legacy code isn't optimized for cloud deployment. The best practice for migration is to refactor this code using the Amazon SageMaker API or the SageMaker Python SDK. However, organizations with a large number of legacy models may not have the time or resources to rewrite them all.

In this post, we share a scalable and easy-to-implement approach for migrating legacy ML code to the AWS Cloud for inference using Amazon SageMaker and AWS Step Functions, with a minimal amount of code refactoring required. You can easily extend this solution to add more functionality. We demonstrate how two different personas, a data scientist and an MLOps engineer, can collaborate to lift and shift hundreds of legacy models.

Solution overview

In this framework, we run the legacy code in a container as a SageMaker Processing job. SageMaker runs the legacy script inside a processing container whose image can be either a SageMaker built-in image or a custom image. The underlying infrastructure for a Processing job is fully managed by SageMaker, so no change to the legacy code is required; familiarity with creating SageMaker Processing jobs is all you need.

We assume the involvement of two personas: a data scientist and an MLOps engineer. The data scientist is responsible for moving the code into SageMaker, either manually or by cloning it from a code repository such as AWS CodeCommit. Amazon SageMaker Studio provides an integrated development environment (IDE) for implementing the various steps of the ML lifecycle, and the data scientist uses it to manually build a custom container that contains the necessary code artifacts for deployment. The container is registered in a container registry such as Amazon Elastic Container Registry (Amazon ECR) for deployment purposes.

The MLOps engineer takes ownership of building a Step Functions workflow that can be reused to deploy the custom container developed by the data scientist with the appropriate parameters. The Step Functions workflow can be as modular as needed to fit the use case, or it can consist of just one step that initiates a single process. To minimize the effort required to migrate the code, we have identified three modular components that together form a fully functional deployment process:

  • Preprocessing
  • Inference
  • Postprocessing

The following diagram illustrates our solution architecture and workflow.

This solution involves the following steps:

  1. The data scientist persona uses Studio to import legacy code by cloning it from a code repository, and then modularizes the code into separate components that follow the steps of the ML lifecycle (preprocessing, inference, and postprocessing).
  2. The data scientist uses Studio, and specifically the Studio Image Build CLI tool provided by SageMaker, to build a Docker image. This CLI tool lets the data scientist build the image directly within Studio and automatically registers the image in Amazon ECR.
  3. The MLOps engineer uses the registered container image to create a deployment for a specific use case using Step Functions. Step Functions is a serverless workflow service that can control SageMaker APIs directly through the Amazon States Language.

SageMaker Processing job

Let's look at how a SageMaker Processing job runs. The following diagram shows how SageMaker spins up a Processing job.

SageMaker takes your script, copies your data from Amazon Simple Storage Service (Amazon S3), and then pulls a processing container. The processing container image can be either a SageMaker built-in image or a custom image that you provide. The underlying infrastructure for a Processing job is fully managed by SageMaker: cluster resources are provisioned for the duration of your job and cleaned up when the job is complete. The output of the Processing job is stored in the S3 bucket you specified. To learn more about building your own container, refer to Build Your Own Processing Container (Advanced Scenario).

The SageMaker Processing job sets up your processing image using a Docker container entrypoint script. You can also provide your own custom entrypoint using the ContainerEntrypoint and ContainerArguments parameters of the AppSpecification API. If you use your own custom entrypoint, you gain the added flexibility to run it as a standalone script without rebuilding your images.

For this example, we construct a custom container and use a SageMaker Processing job for inference. Preprocessing and postprocessing jobs use script mode with a pre-built scikit-learn container.

Prerequisites

To follow along with this post, complete the following prerequisite steps:

  1. Create a Studio domain. For instructions, refer to Onboard to Amazon SageMaker Domain Using Quick setup.
  2. Create an S3 bucket.
  3. Clone the provided GitHub repo into Studio.

The GitHub repo is organized into folders that correspond to the various stages of the ML lifecycle, which makes it easy to navigate and manage:

Migrate the legacy code

In this step, we act as the data scientist responsible for migrating the legacy code.

We begin by opening the build_and_push.ipynb notebook.

The first cell in the notebook guides you through installing the Studio Image Build CLI. This CLI simplifies setup by automatically creating a reusable build environment that you interact with through high-level commands. With the CLI, building an image is as easy as telling it to build, and the result is a link to the location of your image in Amazon ECR. This approach removes the need to manage the complex underlying workflow the CLI orchestrates, streamlining the image building process.

Before running the build command, make sure the role running the command has the necessary permissions, as specified in the CLI GitHub readme or related post. Failing to grant the required permissions can result in errors during the build process.

See the next code:

# Install the sagemaker_studio_image_build utility
import sys
!{sys.executable} -m pip install sagemaker_studio_image_build

To streamline your legacy code, divide it into three distinct Python scripts named preprocessing.py, predict.py, and postprocessing.py. Follow best programming practices by converting the code into functions that are called from a main function. Make sure all necessary libraries are imported and that the requirements.txt file is updated to include any custom libraries.
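
As a sketch of this layout, the following is what a modularized predict.py might look like. All names, paths, and the JSON model format here are illustrative placeholders, not the actual contents of the post's repo.

```python
# Hypothetical predict.py skeleton: names and file formats are placeholders.
import json
import os

# Default paths where a SageMaker Processing job mounts inputs and collects outputs
INPUT_DIR = "/opt/ml/processing/input"
OUTPUT_DIR = "/opt/ml/processing/output"

def load_model(model_dir):
    # Placeholder for loading your serialized legacy model (pickle, joblib, ...)
    with open(os.path.join(model_dir, "model.json")) as f:
        return json.load(f)

def predict(model, records):
    # Placeholder for the legacy scoring logic; here, a trivial linear score
    return [model["intercept"] + model["slope"] * x for x in records]

def main(input_dir=INPUT_DIR, output_dir=OUTPUT_DIR):
    # Orchestrate load -> predict -> save from a single entrypoint
    model = load_model(input_dir)
    with open(os.path.join(input_dir, "data.json")) as f:
        records = json.load(f)
    os.makedirs(output_dir, exist_ok=True)
    with open(os.path.join(output_dir, "predictions.json"), "w") as f:
        json.dump(predict(model, records), f)

if __name__ == "__main__":
    main()
```

Keeping the scoring logic in small functions called from main() is what lets the same file run unchanged on a desktop or inside a Processing container.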

After you organize the code, package it together with the requirements file into a Docker container. You can easily build the container from within Studio using the following command:

By default, the image is pushed to an ECR repository called sagemakerstudio with the tag latest, and the execution role of the Studio app is used, along with the default SageMaker Python SDK S3 bucket. These settings can be easily changed using the appropriate CLI options. See the following code:

sm-docker build . --repository mynewrepo:1.0 --role SampleDockerBuildRole --bucket sagemaker-us-east-1-0123456789999 --vpc-id vpc-0c70e76ef1c603b94 --subnet-ids subnet-0d984f080338960bb,subnet-0ac3e96808c8092f2 --security-group-ids sg-0d31b4042f2902cd0

Now that the container has been built and registered in an ECR repository, let's dive deeper into how we can use it to run predict.py. We also show the process of using a pre-built scikit-learn container to run preprocessing.py and postprocessing.py.

Productionize the container

In this step, we act as the MLOps engineer who productionizes the container built in the previous step.

We use Step Functions to orchestrate the workflow. Step Functions allows exceptional flexibility in integrating a diverse range of services into the workflow, accommodating any dependencies that may exist in the legacy system. This approach ensures that all necessary components are seamlessly integrated and run in the desired sequence, resulting in an efficient and effective workflow.

Step Functions can control certain AWS services directly from the Amazon States Language. To learn more about working with Step Functions and its integration with SageMaker, refer to Manage SageMaker with Step Functions. Using the Step Functions integration with SageMaker, we run the preprocessing and postprocessing scripts as SageMaker Processing jobs in script mode, and run inference as a SageMaker Processing job using a custom container. We do so using AWS SDK for Python (Boto3) CreateProcessingJob API calls.

Preprocessing

SageMaker offers several options for running custom code. If you only have a script without any custom dependencies, you can run it as a Bring Your Own Script (BYOS) job. To do this, simply pass your script to the pre-built scikit-learn framework container and run a SageMaker Processing job in script mode using the ContainerArguments and ContainerEntrypoint parameters of the AppSpecification API. This is a straightforward and convenient method for running simple scripts.

Check out the "Preprocessing Script Mode" state configuration in the sample Step Functions workflow to understand how to configure the CreateProcessingJob API call to run a custom script.
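
To illustrate the shape of that API call, the following sketch builds a script-mode CreateProcessingJob request payload. The helper name, bucket URIs, role ARN, and instance settings are all placeholders, and real values should come from your own account and the sample workflow.

```python
# Sketch of a script-mode (BYOS) CreateProcessingJob request; all argument
# values passed in by the caller are placeholders for your own resources.
def build_script_mode_request(job_name, script_s3_uri, input_s3_uri,
                              output_s3_uri, image_uri, role_arn):
    """Return a payload for boto3's sagemaker.create_processing_job(**request)."""
    return {
        "ProcessingJobName": job_name,
        "AppSpecification": {
            "ImageUri": image_uri,  # pre-built scikit-learn framework image
            # Override the entrypoint to run the script shipped as an input
            "ContainerEntrypoint": [
                "python3", "/opt/ml/processing/input/code/preprocessing.py"],
        },
        "ProcessingInputs": [
            {"InputName": "code",
             "S3Input": {"S3Uri": script_s3_uri,
                         "LocalPath": "/opt/ml/processing/input/code",
                         "S3DataType": "S3Prefix", "S3InputMode": "File"}},
            {"InputName": "data",
             "S3Input": {"S3Uri": input_s3_uri,
                         "LocalPath": "/opt/ml/processing/input/data",
                         "S3DataType": "S3Prefix", "S3InputMode": "File"}},
        ],
        "ProcessingOutputConfig": {"Outputs": [
            {"OutputName": "output",
             "S3Output": {"S3Uri": output_s3_uri,
                          "LocalPath": "/opt/ml/processing/output",
                          "S3UploadMode": "EndOfJob"}}]},
        "ProcessingResources": {"ClusterConfig": {
            "InstanceCount": 1, "InstanceType": "ml.m5.xlarge",
            "VolumeSizeInGB": 30}},
        "RoleArn": role_arn,
    }
```

You would then submit it with boto3: `boto3.client("sagemaker").create_processing_job(**request)`.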

Inference

You can run a custom container using the Build Your Own Processing Container approach. The SageMaker Processing job operates on the /opt/ml local path, and you can specify your ProcessingInputs and their local paths in the configuration. The Processing job copies the artifacts to the local container and starts the job. After the job is complete, it copies the artifacts specified in the local path of the ProcessingOutputs to their specified external location.

Check out the "Inference Custom Container" state configuration in the sample Step Functions workflow to understand how to configure the CreateProcessingJob API call to run a custom container.
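
As an illustration of what such a state can look like, here is a hedged sketch of a createProcessingJob.sync Task state in the Amazon States Language, expressed as a Python dict. The state name and the JSONPath input fields ($.custom_image_uri and so on) are assumptions modeled on the parameter list later in this post, not the actual sample workflow.

```python
import json

# Hedged sketch of an "Inference Custom Container" Task state; the JSONPath
# references ($.custom_image_uri, $.input_uri, ...) are assumed workflow-input
# fields modeled on this post's parameter list.
inference_state = {
    "Type": "Task",
    # .sync makes Step Functions wait for the Processing job to finish
    "Resource": "arn:aws:states:::sagemaker:createProcessingJob.sync",
    "Parameters": {
        "ProcessingJobName.$": "$$.Execution.Name",
        "AppSpecification": {"ImageUri.$": "$.custom_image_uri"},
        "ProcessingInputs": [{
            "InputName": "data",
            "S3Input": {
                "S3Uri.$": "$.input_uri",
                "LocalPath": "/opt/ml/processing/input",
                "S3DataType": "S3Prefix",
                "S3InputMode": "File",
            },
        }],
        "ProcessingOutputConfig": {"Outputs": [{
            "OutputName": "output",
            "S3Output": {
                "S3Uri.$": "$.output_uri",
                "LocalPath": "/opt/ml/processing/output",
                "S3UploadMode": "EndOfJob",
            },
        }]},
        "ProcessingResources": {"ClusterConfig": {
            "InstanceCount": 1,
            "InstanceType.$": "$.instance_type",
            "VolumeSizeInGB.$": "$.volume_size",
        }},
        "RoleArn.$": "$.role",
    },
    "End": True,
}

# A state machine definition embedding this state serializes to JSON:
definition = json.dumps({"StartAt": "InferenceCustomContainer",
                         "States": {"InferenceCustomContainer": inference_state}})
```

The .sync suffix on the resource ARN is what turns the API call into a blocking step, so downstream states only run after inference completes.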

Postprocessing

You can run a postprocessing script just like a preprocessing script using the Step Functions CreateProcessingJob step. Running a postprocessing script allows you to perform custom processing tasks after the inference job is complete.

Create the Step Functions workflow

For rapid prototyping, we use the Step Functions Amazon States Language. You can edit the Step Functions definition directly using the States Language. Refer to the sample Step Functions workflow.


You can create a new Step Functions state machine on the Step Functions console by choosing Write your workflow in code.

Step Functions can examine the resources you use and create a role. However, you may see the following message:

"Step Functions cannot generate an IAM policy if the RoleArn for SageMaker is from a Path. Hardcode the SageMaker RoleArn in your state machine definition, or choose an existing role with the proper permissions for Step Functions to call SageMaker."

To address this, you must create an AWS Identity and Access Management (IAM) role for Step Functions. For instructions, refer to Creating an IAM role for your state machine. Then attach the following IAM policy to provide the required permissions for running the workflow:

{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Action": [
                "sagemaker:createProcessingJob",
                "sagemaker:ListTags",
                "sagemaker:AddTags"
            ],
            "Resource": "*"
        },
        {
            "Effect": "Allow",
            "Action": [
                "iam:PassRole"
            ],
            "Resource": "*",
            "Condition": {
                "StringEquals": {
                    "iam:PassedToService": "sagemaker.amazonaws.com"
                }
            }
        }
    ]
}

The following figure illustrates the flow of data and container images into each step of the Step Functions workflow.

The following is a list of the minimal parameters required to initialize Step Functions; you can also refer to the sample input parameters JSON:

  • input_uri – The S3 URI for the input files
  • output_uri – The S3 URI for the output files
  • code_uri – The S3 URI for the script files
  • custom_image_uri – The container URI for the custom container you have built
  • scikit_image_uri – The container URI for the pre-built scikit-learn framework
  • role – The execution role to run the job
  • instance_type – The instance type you need to use to run the container
  • volume_size – The storage volume size you require for the container
  • max_runtime – The maximum runtime for the container, with a default value of 1 hour
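
Put together, a workflow input covering these parameters might look like the following sketch. Every value is a placeholder, and the exact field names and formats should be verified against the repo's sample input parameters JSON.

```python
import json

# Illustrative workflow input; all URIs, ARNs, and the image tags are
# placeholders, not values from the post's repo.
params = {
    "input_uri": "s3://my-bucket/legacy/input/",
    "output_uri": "s3://my-bucket/legacy/output/",
    "code_uri": "s3://my-bucket/legacy/code/",
    "custom_image_uri": "111122223333.dkr.ecr.us-east-1.amazonaws.com/mynewrepo:1.0",
    "scikit_image_uri": "<pre-built scikit-learn image URI for your Region>",
    "role": "arn:aws:iam::111122223333:role/SampleProcessingRole",
    "instance_type": "ml.m5.xlarge",
    "volume_size": 30,       # GB
    "max_runtime": 3600,     # seconds; the 1-hour default mentioned above
}

execution_input = json.dumps(params)
# To start the workflow, you would pass this JSON to Step Functions, e.g.:
# boto3.client("stepfunctions").start_execution(
#     stateMachineArn="<your state machine ARN>",
#     input=execution_input,
# )
```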

Run the workflow

We've broken down the legacy code into manageable components: preprocessing, inference, and postprocessing. To support our inference needs, we built a custom container equipped with the necessary library dependencies. Our plan is to use Step Functions, taking advantage of its ability to call the SageMaker API. We've shown two methods for running custom code using the SageMaker API: a SageMaker Processing job that uses a pre-built image and takes a custom script at runtime, and a SageMaker Processing job that uses a custom container packaged with the necessary artifacts to run custom inference.

The following figure shows a run of the Step Functions workflow.

Summary

In this post, we discussed the process of migrating legacy ML Python code from local development environments to a standardized MLOps procedure. With this approach, you can effortlessly transfer hundreds of models and incorporate your desired enterprise deployment practices. We presented two different methods for running custom code on SageMaker, and you can choose the one that best suits your needs.

If you require a highly customizable solution, it's advisable to use the custom container approach. You may find it more suitable to use pre-built images to run your custom scripts if you have basic scripts and don't need to create a custom container, as described in the preprocessing step earlier. Additionally, if required, you can apply this solution to containerize legacy model training and evaluation steps, just as the inference step is containerized in this post.


About the Authors

Bhavana Chirumamilla is a Senior Resident Architect at AWS with a strong passion for data and machine learning operations. She brings a wealth of experience and enthusiasm to helping enterprises build effective data and ML strategies. In her spare time, Bhavana enjoys spending time with her family and engaging in activities such as traveling, hiking, gardening, and watching documentaries.

Shyam Namavaram is a Senior Artificial Intelligence (AI) and Machine Learning (ML) Specialist Solutions Architect at Amazon Web Services (AWS). He passionately works with customers to accelerate their AI and ML adoption by providing technical guidance and helping them innovate and build secure cloud solutions on AWS. He specializes in AI and ML, containers, and analytics technologies. Outside of work, he loves playing sports and experiencing nature through trekking.

Qingwei Li is a Machine Learning Specialist at Amazon Web Services. He received his PhD in Operations Research after he broke his advisor's research grant account and failed to deliver the Nobel Prize he promised. Currently, he helps customers in the financial services and insurance industry build machine learning solutions on AWS. In his spare time, he likes reading and teaching.

Srinivasa Shaik is a Solutions Architect at AWS based in Boston. He helps enterprise customers accelerate their journey to the cloud. He is passionate about containers and machine learning technologies. In his spare time, he enjoys spending time with his family, cooking, and traveling.

