Review: Snowflake aces Python machine learning

August 5, 2022

Last year I wrote about eight databases that support in-database machine learning. In-database machine learning is important because it brings the machine learning processing to the data, which is much more efficient for big data, rather than forcing data scientists to extract subsets of the data to where the machine learning training and inference run.

These databases each work differently:

  • Amazon Redshift ML uses SageMaker Autopilot to automatically create prediction models from the data you specify via a SQL statement, which is extracted to an Amazon S3 bucket. The best prediction function found is registered in the Redshift cluster.
  • BlazingSQL can run GPU-accelerated queries on data lakes in Amazon S3, pass the resulting DataFrames to RAPIDS cuDF for data manipulation, and finally perform machine learning with RAPIDS XGBoost and cuML, and deep learning with PyTorch and TensorFlow.
  • BigQuery ML brings much of the power of Google Cloud Machine Learning into the BigQuery data warehouse with SQL syntax, without extracting the data from the data warehouse.
  • IBM Db2 Warehouse includes a broad set of in-database SQL analytics with some basic machine learning functionality, plus in-database support for R and Python.
  • Kinetica provides a full in-database lifecycle solution for machine learning accelerated by GPUs, and can calculate features from streaming data.
  • Microsoft SQL Server can train and infer machine learning models in several programming languages.
  • Oracle Cloud Infrastructure can host data science resources integrated with its data warehouse, object store, and functions, allowing for a full model development lifecycle.
  • Vertica has a nice set of machine learning algorithms built in, and can import TensorFlow and PMML models. It can do prediction from imported models as well as its own models.

Now there's another database that can run machine learning internally: Snowflake.

Snowflake overview

Snowflake is a fully relational ANSI SQL enterprise data warehouse that was built from the ground up for the cloud. Its architecture separates compute from storage so that you can scale up and down on the fly, without delay or disruption, even while queries are running. You get the performance you need exactly when you need it, and you only pay for the compute you use.

Snowflake currently runs on Amazon Web Services, Microsoft Azure, and Google Cloud Platform. It has recently added external tables for on-premises storage, which lets Snowflake users access their data in on-premises storage systems from companies including Dell Technologies and Pure Storage, expanding Snowflake beyond its cloud-only roots.

Snowflake is a fully columnar database with vectorized execution, making it capable of addressing even the most demanding analytic workloads. Snowflake's adaptive optimization ensures that queries automatically get the best performance possible, with no indexes, distribution keys, or tuning parameters to manage.

Snowflake can support unlimited concurrency with its unique multi-cluster, shared data architecture. This allows multiple compute clusters to operate simultaneously on the same data without degrading performance. Snowflake can even scale automatically to handle varying concurrency demands with its multi-cluster virtual warehouse feature, transparently adding compute resources during peak load periods and scaling down when loads subside.

Snowpark overview

When I reviewed Snowflake in 2019, if you wanted to program against its API you needed to run the program outside of Snowflake and connect through ODBC or JDBC drivers or through native connectors for programming languages. That changed with the introduction of Snowpark in 2021.


Snowpark brings to Snowflake deeply integrated, DataFrame-style programming in the languages developers like to use, starting with Scala, then extending to Java and now Python. Snowpark is designed to make building complex data pipelines a breeze and to allow developers to interact with Snowflake directly without moving data.

The Snowpark library provides an intuitive API for querying and processing data in a data pipeline. Using this library, you can build applications that process data in Snowflake without moving data to the system where your application code runs.

The Snowpark API provides programming language constructs for building SQL statements. For example, the API provides a select method that you can use to specify the column names to return, rather than writing 'select column_name' as a string. Although you can still use a string to specify the SQL statement to execute, you benefit from features like intelligent code completion and type checking when you use the native language constructs provided by Snowpark.
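
Here is a minimal sketch of the two styles side by side, assuming an existing Snowpark session object and a hypothetical CUSTOMERS table (neither is from Snowflake's documentation):

from snowflake.snowpark.functions import col

# native DataFrame constructs: the IDE can complete and type-check column expressions
df1 = session.table("CUSTOMERS").select(col("NAME"), col("CITY"))

# equivalent raw SQL string: works, but typos only surface at execution time
df2 = session.sql("select NAME, CITY from CUSTOMERS")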

Snowpark operations are executed lazily on the server, which reduces the amount of data transferred between your client and the Snowflake database. The core abstraction in Snowpark is the DataFrame, which represents a set of data and provides methods to operate on that data. In your client code, you construct a DataFrame object and set it up to retrieve the data that you want to use.

The data isn't retrieved at the time you construct the DataFrame object. Instead, when you are ready to retrieve the data, you can perform an action that evaluates the DataFrame objects and sends the corresponding SQL statements to the Snowflake database for execution.
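
A small illustration of that laziness, again using a hypothetical CUSTOMERS table: nothing runs in Snowflake until an action such as show() or collect() is called.

from snowflake.snowpark.functions import col

# building the DataFrame is lazy; no query is sent yet
active_df = session.table("CUSTOMERS").filter(col("STATUS") == "ACTIVE")

# actions generate the SQL and execute it on the Snowflake side
active_df.show()            # prints a sample of rows
rows = active_df.collect()  # returns a list of Row objects to the client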

[Image, IDG] Snowpark block diagram. Snowpark expands the internal programmability of the Snowflake cloud data warehouse from SQL to Python, Java, Scala, and other programming languages.

Snowpark for Python overview

Snowpark for Python is available in public preview to all Snowflake customers, as of June 14, 2022. In addition to the Snowpark Python API and Python Scalar User-Defined Functions (UDFs), Snowpark for Python supports the Python UDF Batch API (Vectorized UDFs), Table Functions (UDTFs), and Stored Procedures.

These features combined with Anaconda integration provide the Python community of data scientists, data engineers, and developers with a variety of flexible programming contracts and access to open source Python packages to build data pipelines and machine learning workflows directly within Snowflake.

Snowpark for Python includes a local development experience you can install on your own machine, including a Snowflake channel on the Conda repository. You can use your preferred Python IDEs and dev tools and upload your code to Snowflake knowing that it will be compatible.

By the way, Snowpark for Python is free open source. That's a change from Snowflake's history of keeping its code proprietary.

The following sample Snowpark for Python code creates a DataFrame that aggregates book sales by year. Under the hood, DataFrame operations are transparently converted into SQL queries that get pushed down to the Snowflake SQL engine.

from snowflake.snowpark import Session
from snowflake.snowpark.functions import col, year

# fetch snowflake connection information
from config import connection_parameters

# build connection to Snowflake
session = Session.builder.configs(connection_parameters).create()

# use Snowpark API to aggregate book sales by year
booksales_df = session.table("sales")
booksales_by_year_df = booksales_df.groupBy(year("sold_time_stamp")).agg([(col("qty"),"count")]).sort("count", ascending=False)
booksales_by_year_df.show()

Getting started with Snowpark Python

Snowflake's "getting started" tutorial demonstrates an end-to-end data science workflow using Snowpark for Python to load, clean, and prepare data, and then deploy the trained model to Snowflake using a Python UDF for inference. In 45 minutes (nominally), it teaches:

  • How to create a DataFrame that loads data from a stage (a rough sketch follows this list);
  • How to perform data and feature engineering using the Snowpark DataFrame API; and
  • How to bring a trained machine learning model into Snowflake as a UDF to score new data.
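
For orientation, the first of those steps boils down to something like the following sketch. It is not the quickstart's own code; the stage, file, and table names are placeholders, and it assumes the session object created earlier.

# read a Parquet file from a named stage into a (lazy) Snowpark DataFrame
raw_df = session.read.parquet("@MY_STAGE/raw_telco_data.parquet")

# materialize the result as a Snowflake table
raw_df.write.mode("overwrite").save_as_table("RAW_TELCO_DATA")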

The task is the classic customer churn prediction for an internet service provider, which is a straightforward binary classification problem. The tutorial starts with a local setup phase using Anaconda; I installed Miniconda for that. It took longer than I expected to download and install all the dependencies of the Snowpark API, but that worked fine, and I appreciate the way Conda environments avoid clashes among libraries and versions.

This quickstart starts with a single Parquet file of raw data and extracts, transforms, and loads the relevant information into multiple Snowflake tables.

[Image, IDG] We're looking at the beginning of the "Load Data with Snowpark" quickstart. This is a Python Jupyter Notebook running on my MacBook Pro that calls out to Snowflake and uses the Snowpark API. Step 3 initially gave me problems, because I wasn't clear from the documentation about where to find my account ID and how much of it to include in the account field of the config file. For future reference, look in the "Welcome To Snowflake!" email for your account information.
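
For reference, the config file amounts to a dictionary of standard connection parameters. A minimal sketch with placeholder values follows; the account identifier is typically the org-account form from that email, without the snowflakecomputing.com suffix.

# config.py -- placeholder values, not real credentials
connection_parameters = {
    "account": "MYORG-MYACCOUNT",
    "user": "MY_USER",
    "password": "********",
    "role": "SYSADMIN",
    "warehouse": "MY_WH",
    "database": "MY_DB",
    "schema": "PUBLIC",
}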

[Image, IDG] Here we're checking the loaded table of raw historical customer data and beginning to set up some transformations.

[Image, IDG] Here we've extracted and transformed the demographics data into its own DataFrame and saved that as a table.

[Image, IDG] In step 12, we extract and transform the fields for a location table. As before, this is done with a SQL query into a DataFrame, which is then saved as a table.

[Image, IDG] Here we extract and transform data from the raw DataFrame into a Services table in Snowflake.

[Image, IDG] Next we extract, transform, and load the final table, Status, which shows the churn status and the reason for leaving. Then we do a quick sanity check, joining the Location and Services tables into a Join DataFrame, then aggregating total charges by city and type of contract for a Result DataFrame.
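
A hedged sketch of that sanity check; the table and column names here are stand-ins rather than the quickstart's exact schema:

from snowflake.snowpark.functions import col, sum as sum_

# assume the Location and Services tables created in the previous steps
location_df = session.table("LOCATION")
services_df = session.table("SERVICES")

# join on the customer key, then aggregate total charges by city and contract type
result_df = (location_df.join(services_df, "CUSTOMERID")
             .group_by(col("CITY"), col("CONTRACT"))
             .agg(sum_(col("TOTALCHARGES")).alias("TOTAL_CHARGES")))
result_df.show()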

[Image, IDG] In this step we join the Demographics and Services tables to create a TRAIN_DATASET view. We use DataFrames for intermediate steps, and use a select statement on the joined DataFrame to reorder the columns.

Now that we've completed the ETL/data engineering phase, we can move on to the data analysis/data science phase.

[Image, IDG] This page introduces the analysis we're about to perform.

[Image, IDG] We start by pulling in the Snowpark, Pandas, Scikit-learn, Matplotlib, datetime, NumPy, and Seaborn libraries, as well as reading our configuration. Then we establish our Snowflake database session, sample 10K rows from the TRAIN_DATASET view, and convert that to Pandas format.
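
That sampling step amounts to roughly the following, assuming the session object from earlier:

# sample about 10,000 rows server-side, then pull them down as a Pandas DataFrame
train_sdf = session.table("TRAIN_DATASET").sample(n=10000)
train_pd = train_sdf.to_pandas()
print(train_pd.shape)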

[Image, IDG] We continue with some exploratory data analysis using NumPy, Seaborn, and Pandas. We look for non-numerical variables and classify them as categories.

[Image, IDG] Once we have found the categorical variables, we identify the numerical variables and plot some histograms to see the distribution.
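
In Pandas terms, that classification and plotting boils down to something like this, with train_pd being the sampled DataFrame from the previous sketch:

import numpy as np
import matplotlib.pyplot as plt

# non-numerical columns become candidate categorical variables
categorical_cols = train_pd.select_dtypes(exclude=np.number).columns.tolist()

# numerical columns get histograms so we can eyeball their distributions
numerical_cols = train_pd.select_dtypes(include=np.number).columns.tolist()
train_pd[numerical_cols].hist(bins=30, figsize=(10, 6))
plt.show()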

[Image, IDG] All four histograms.

[Image, IDG] Given the assortment of ranges we saw in the previous screen, we need to scale the variables for use in a model.

[Image, IDG] Having all the numerical variables lie in the range from 0 to 1 will help immensely when we build a model.

[Image, IDG] Three of the numerical variables have outliers. Let's drop them to avoid having them skew the model.
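
One common way to do both steps, sketched with Scikit-learn and Pandas; the column names and the 1.5*IQR outlier rule are stand-ins, not necessarily what the quickstart uses:

from sklearn.preprocessing import MinMaxScaler

# scale every numerical variable into the 0-1 range
scaler = MinMaxScaler()
train_pd[numerical_cols] = scaler.fit_transform(train_pd[numerical_cols])

# drop rows that are outliers in the affected columns (placeholder names)
for column in ["TOTALCHARGES", "MONTHLYCHARGES", "TENUREMONTHS"]:
    q1, q3 = train_pd[column].quantile([0.25, 0.75])
    iqr = q3 - q1
    train_pd = train_pd[train_pd[column].between(q1 - 1.5 * iqr, q3 + 1.5 * iqr)]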

[Image, IDG] If we look at the cardinality of the categorical variables, we see they range from 2 to 4 categories.

[Image, IDG] We select our variables and write the Pandas data out to a Snowflake table, TELCO_TRAIN_SET.
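
Getting the prepared Pandas data back into Snowflake can be as simple as this one-line sketch:

# push the prepared Pandas DataFrame back up to Snowflake as a table
session.create_dataframe(train_pd).write.mode("overwrite").save_as_table("TELCO_TRAIN_SET")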

Finally we create and deploy a user-defined function (UDF) for prediction, using more data and a better model.

[Image, IDG] Now we set up for deploying a predictor. This time we sample 40K values from the training dataset.

[Image, IDG] Now we're setting up for model fitting, on our way to deploying a predictor. Splitting the dataset 80/20 is standard stuff.

[Image, IDG] This time we'll use a Random Forest classifier and set up a Scikit-learn pipeline that handles the data engineering as well as doing the fitting.

[Image, IDG] Let's see how we did. The accuracy is 99.38%, which isn't shabby, and the confusion matrix shows relatively few false predictions. The most important feature is whether there is a contract, followed by tenure length and monthly charges.
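
Sketched in Scikit-learn, the whole fit-and-evaluate step looks roughly like this; the label column, feature lists, and preprocessing are stand-ins for the quickstart's own choices:

from sklearn.model_selection import train_test_split
from sklearn.compose import ColumnTransformer
from sklearn.preprocessing import OneHotEncoder
from sklearn.ensemble import RandomForestClassifier
from sklearn.pipeline import Pipeline
from sklearn.metrics import accuracy_score, confusion_matrix

X = train_pd.drop(columns=["CHURNVALUE"])  # placeholder label column
y = train_pd["CHURNVALUE"]

# 80/20 split
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=42)

# one-hot encode the categorical features, pass the scaled numericals through,
# and fit a Random Forest on the result
cat_features = [c for c in categorical_cols if c in X.columns]
preprocess = ColumnTransformer(
    [("cat", OneHotEncoder(handle_unknown="ignore"), cat_features)],
    remainder="passthrough",
)
model = Pipeline([("prep", preprocess), ("rf", RandomForestClassifier(n_estimators=100))])
model.fit(X_train, y_train)

# accuracy, confusion matrix, and raw feature importances from the forest
pred = model.predict(X_test)
print(accuracy_score(y_test, pred))
print(confusion_matrix(y_test, pred))
print(model.named_steps["rf"].feature_importances_)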

[Image, IDG] Now we define a UDF to predict churn and deploy it into the data warehouse.

[Image, IDG] Step 18 shows another way to register the UDF, using session.udf.register() instead of a select statement. Step 19 shows another way to run the prediction function, incorporating it into a SQL select statement instead of a DataFrame select statement.
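
A rough sketch of both registration and invocation styles; the function body, names, stage, and package list are placeholders rather than the quickstart's actual artifacts:

from snowflake.snowpark.types import FloatType, StringType
from snowflake.snowpark.functions import call_udf, col

def predict_churn_fn(contract, tenure_months, monthly_charges):
    # placeholder logic; the real UDF would load the trained model and score the row
    return 1.0 if (contract == "Month-to-month" and tenure_months < 6.0) else 0.0

# register a permanent Python UDF in the data warehouse
session.udf.register(
    predict_churn_fn,
    name="PREDICT_CHURN",
    input_types=[StringType(), FloatType(), FloatType()],
    return_type=FloatType(),
    packages=["scikit-learn", "pandas"],
    stage_location="@MY_STAGE",
    is_permanent=True,
    replace=True,
)

# call it from a DataFrame select ...
session.table("TELCO_TRAIN_SET").select(
    call_udf("PREDICT_CHURN", col("CONTRACT"), col("TENUREMONTHS"), col("MONTHLYCHARGES"))
).show()

# ... or from a SQL select statement
session.sql(
    "select PREDICT_CHURN(CONTRACT, TENUREMONTHS, MONTHLYCHARGES) from TELCO_TRAIN_SET"
).show()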

You can go into more depth by running Machine Learning with Snowpark Python, a 300-level quickstart, which analyzes Citibike rental data and builds an orchestrated end-to-end machine learning pipeline to perform monthly forecasts using Snowflake, Snowpark Python, PyTorch, and Apache Airflow. It also displays results using Streamlit.

Overall, Snowpark for Python is excellent. While I stumbled over a couple of things in the quickstart, they were resolved fairly quickly with help from Snowflake's extensibility support.

I like the wide selection of popular Python machine learning and deep learning libraries and frameworks included in the Snowpark for Python installation. I like the way Python code running on my local machine can control Snowflake warehouses dynamically, scaling them up and down at will to control costs and keep runtimes reasonably fast. I like the efficiency of doing most of the heavy lifting inside the Snowflake warehouses using Snowpark. And I like being able to deploy predictors as UDFs in Snowflake without incurring the costs of deploying prediction endpoints on major cloud services.

Essentially, Snowpark for Python gives data engineers and data scientists a nice way to do DataFrame-style programming against the Snowflake enterprise data warehouse, including the ability to set up full-blown machine learning pipelines to run on a recurrent schedule.

—

Cost: $2 per credit plus $23 per TB per month storage, standard plan, prepaid storage. 1 credit = 1 node*hour, billed by the second. Higher-level plans and on-demand storage are more expensive. Data transfer charges are extra, and vary by cloud and region. When a virtual warehouse is not running (i.e., when it is set to sleep mode), it does not consume any Snowflake credits. Serverless features use Snowflake-managed compute resources and consume Snowflake credits when they are used.

Platform: Amazon Web Services, Microsoft Azure, Google Cloud Platform.

Copyright © 2022 IDG Communications, Inc.
