

Extreme Networks provides products that give companies cloud-driven networking solutions.

As companies produce and rely on more data, networking is the critical backbone for how computers are linked to share data. Extreme Networks is a leader in the sector that is using a range of technologies to design and support modern networks: machine learning (ML), artificial intelligence (AI), analytics, and automation.

See below to learn all about Extreme Networks and where they stand in the networking space: 

The networking market has grown across many sub-sectors, including both software-defined networking (SDN) and cloud-driven networking.

The global SDN market is estimated at $18.5 billion in 2023 and projected to reach $36.2 billion by 2026, according to StrategyR.

The global cloud-driven networking market was valued at an estimated $3.32 billion in 2023 and is projected to reach $14.61 billion in 2027, according to Fortune Business Insights.

In fiscal year 2023, Extreme Networks’ net revenue was over $1 billion, up 6% year over year.

“We have seen enormous opportunity and growth within the cloud managed wired and wireless networking industry,” says Will Hopson, a senior territory manager, Extreme Networks.

In Extreme Networks’ 2023 annual report, the company highlights that through cloud-driven networking and automation, network administrators can scale how they provide “productivity, availability, accessibility, manageability, security, and speed, regardless of how distributed the network is.”

For more: Top Enterprise 5G Networks

One Network

- Wired platforms: Universal Switches, ExtremeXOS Switches, VSP Switches
- Wireless platforms: Universal APs (WiFi 6/6E APs), indoor/outdoor APs, wireless controllers
- Extreme SD-WAN
- Fabric services: Fabric Connect

One Cloud

- ExtremeCloud IQ
- ExtremeCloud IQ-Site Engine
- ExtremeCloud IQ-Controller
- Business insights: Extreme Analytics, Extreme Location, Extreme Guest

One Extreme

- Professional services
- Maintenance services
- Customer success
Reduced security risk: through hyper-segmentation of the network

Increased savings: Extreme Networks reports traditional network infrastructure, operational, and admin savings up to 10% 

Improved support and problem resolution: Extreme reports that 94% of customer help calls are resolved by the network specialist who answers the call, and the company co-locates support with engineering

Enhanced network intelligence: through the ability to have a 360-degree view of the network, users, devices, and applications

Use latest software: Universal hardware platforms allow customers to evolve deployment models by changing the software while using existing hardware

Jackpot Junction Casino Hotel in Minnesota needed management for its wireless networks and devices that was secure and compliant with gaming regulations. With the constant risk of outages and downtime, Jackpot Junction's IT team needed to keep the casino up and running.

Nick Potter, director of IT, knew the casino needed a new approach to network management, including automating processes and gaining visibility into pain points, to help the IT team handle network demand and support guest services.

Potter decided that Extreme Networks could help. Adding the network management solution increased network uptime and enabled the casino to keep up with new initiatives, including self-service guest kiosks. By future-proofing support for the latest gaming devices, staff also gained easier access to tablets for the hotel and restaurant.

Jackpot Junction’s networking solution from Extreme included ExtremeSwitching, FabricConnect, and ExtremeCloud IQ. 

“It’s just kind of magic how you move a device and the Fabric recognizes it and routes it,” Potter says.

Jackpot Junction’s networking implementation by Extreme Networks also allowed the casino to support the Lower Sioux Native American community nearby with networking, including high-speed, secure connectivity for their government buildings, community center and health care facility.

Dynamic role-based policies:

support a scalable policy mechanism on wired and wireless devices for users, devices, and applications throughout the network

Application hosting:

run onboard applications alongside the switch operating system (OS), without impacting performance; provide network insight through on-board analytics applications; enable new network applications without the need for a separate hardware device

Artificial intelligence (AI)/machine learning (ML)-driven insights:

ongoing integration with ExtremeCloud IQ offers the ability to fine-tune the network before issues impact service


Customers also report a steady stream of new networking ideas coming out of the company.

For more: 5 Top Private 5G Trends

Users give mostly positive reviews to several Extreme Networks products, which are partly driven by AI:

PeerSpot: 4.1 out of 5

PeerSpot: 4.3 out of 5

CRN 2023 “Networking Products of the Year”:

due to how they “enhance insight, visibility, and control” in networking

Gartner 2023 “Magic Quadrant for Enterprise Wired and Wireless LAN Infrastructure”:

for their operations in distributed environments and delivering in-demand resources and capabilities 

CRN 2023 “Top 10 Coolest New Networking Products”:

due to a subsection of their ExtremeCloud IQ platform called ExtremeCoPilot

CRN 2023 “Data Center 50”:

one of the key players in the data center market.

2023 “IT Champion, Networking” by Computerwoche:

for network infrastructure

Extreme Networks helps companies establish their networks at the software level as well as the hardware level with switches and routers. With reported benefits such as reduced security risk and network-based savings, Extreme Networks is a leading player in the growing networking market.

“Our world is more reliant on technology than ever before, and as a technology company it is our duty to ensure we are making a positive impact,” says Katy Motiey, chief admin and sustainability officer, Extreme Networks.

A company looking for networking solutions to build out or upgrade a network can consider Extreme Networks as a provider with focused options.

For more products: NordLayer: Network Security Review


Tech Giants Move Toward Social Networks

SANTA CLARA — As the Facebook generation becomes a bigger part of the enterprise, companies face the challenge of implementing increasingly familiar social network technologies in concert with legacy systems. That was one of the themes expressed by a panel of leading vendors here at the Collaborate 2.0 conference sponsored by SD Forum.

“In IT, a user is a login; on Facebook, a user is a profile with a picture and other details. That’s pretty empowering. End users are driving change,” said Chuck Ganapathi, senior vice president of products at Salesforce.com (NYSE: CRM).


The next generation of IT applications may well leverage something like Facebook’s look and feel for a logical reason. “Facebook has over 300 million users now and is on the way to training half a billion people on what is really a pretty sophisticated application — there’s a lot going on there,” Ganapathi said.

And as these collaborative, social network technologies inevitably spread, Ganapathi said a key issue to be resolved is IT control versus user power.

“Your employees are going to download Yammer because it’s a better way to communicate,” he said. “Getting to a happy medium is going to be very important in the enterprise.”

But it’s also not just about allowing blogs or adding a corporate wiki, he said.

“There are lots of tools today to make the conversations in your company more social, but what about the data that’s sitting there in Excel, in ERP, in e-mail? How you make that data social is going to be key.”

Microsoft has lit a FUSE

Matt Thompson, general manager of Microsoft’s (NASDAQ: MSFT) developer and platform evangelism in Silicon Valley, said the software giant is ready to make moves in the social network/collaboration space beyond its already successful SharePoint software. He said Microsoft Research has about 25 different social collaboration projects they’ve put under one group called FUSE Labs.

“You’re going to see some innovative stuff under social collaboration,” he said. “We have a vision for where this is going in the future. Video and telepresence is a key piece. And you’ll see a lot more interoperability as well. This can’t be a single stack.”

Thompson noted that Facebook execs have said they have no plans to develop a private version or social graph for the enterprise, though they haven’t ruled out working with partners — one of which is Microsoft, which owns a stake in the social networking phenom.

“Internal IT is a very fertile ground to disrupt,” Thompson said. “The key is there won’t be multiple social graphs. I don’t think Facebook realizes the big role they have.”

That said, Thompson gave Facebook big props for opening up its platform to let users take their Facebook identity with them when visiting other sites.

Thompson also took note of Twitter, which he said he loves. Like Facebook Connect, he said a huge percentage of users use the service without Twitter.com as a starting point.

“They’re delivering collaboration at 140 characters wherever the user may be,” he said.

Cisco and the future of work

Like Microsoft, Cisco (NASDAQ: CSCO) is investing in multiple social network and collaborative areas, including a portfolio of nine businesses in the incubation stage.


“Our thesis is that we’re on the cusp of a big transformation like the Internet in the ’90s around the future of work, putting people and productivity back into the equation,” said Didier Moretti, vice president of business incubation in Cisco’s Emerging Technologies group.

IBM (NYSE: IBM) is another company very much on the social network bandwagon. Roosevelt Bynum, who manages the company’s developerWorks Web applications, said the My developerWorks community site is “like Facebook for geeks.”

(Update: An earlier version of this article incorrectly referred to Cisco as the development partner that helped Starbucks create the MyStarbucksIdea site. The partner was actually Salesforce.com, which powers the site.)


How To Begin A Conversation While Networking

Communicating effectively at networking events is key, and having a few opening lines ready is one way to strike up a conversation with someone new and make a business connection.

Why Conversation Starters are so Important

A conversation starter is a way to establish common ground with another person. Done well, this small talk can lead to more B2B networking and spark genuine interest in your product.

Many of the best conversation starters double as sales techniques.

Tips to Start a Conversation at Networking Events

You need to be able to strike up a conversation at these events. The following tips will help you make a great first impression.

A Conversation Starts with the Right Body Language

Non-verbal cues make engaging conversations possible, and research shows that stance is crucial. Don't be timid, or you will appear tentative. Your body language should complement your conversation.

Good conversations also require the right distance: close enough to talk comfortably, but not so close that you crowd the other person.

Finding Common Ground

Finding common ground is a great way to get past the awkward silence. To get things started, open with a topic the other person is likely to care about.

Keep your eyes open for great conversations

Engaging conversations are only possible if you keep your eyes open. Staying observant shows genuine interest and is a great way to get a conversation started.

Asking Follow Up Questions

Great conversation starters build meaningful connections, but the follow-up matters just as much. Ask the other person about their experience and background.

A great conversation is built on mutual interest.

Conversation Topics to Avoid

Some topics make for bad conversation starters. Avoiding them can spare everyone some social anxiety:

Current events

Complaints about work

Don’t open by discussing things you dislike; that is a bad way to start a conversation. Don’t jump into deep topics immediately, and leave the peer-reviewed studies alone.

Face to Face Networking Events: How to start a conversation

Rather than hunting for the single best conversation starter, it makes more sense to have several approaches ready. Here are 8 tips to spark a conversation with someone you meet face-to-face.

1. Introduce Yourself

This is a great way to start a conversation. A quick description of yourself is the best way to get someone talking, and it can be the beginning of a natural conversation.

2. Go For A Walk

Walking together can lead to deep conversations. This is especially important when you are on your way to the next panel discussion.

3. Mention A Favourite Food

One thing you will have in common with everyone at an event is the food, and it can be a great jumping-off point for a good conversation.

4. Share a Light Headline

A word of caution with this one: stick to light news, such as an update about a sports team. Conversation starters that aren’t political can be great, and mentioning a positive event is another good way to begin.

5. Compliment the Other Person

A sincere compliment can be a great conversation starter, especially if it is specific.

6. Mention a Detail

7. Keep Your Right Hand Free

Every conversation should include a handshake, so keep your right hand at the ready when you meet a new conversation partner.

8. Begin With the Weather

Pay attention to the world around you. Real-life situations, such as the weather, can start a conversation, and you can refer back to them later in the discussion.

How to start a conversation during Online Networking Events

Conversation flows naturally when two people meet face-to-face, so online meetings take a bit more planning; more people than ever are looking for conversation starters, small talk, and business tips to facilitate them. Here are eight small talk tips you can use to engage a conversation partner at an online event.

9. Do Some Research

10. Make Sure To Engage

Online events can include discussions or even quizzes. Open-ended questions work well; for example, you can start a conversation by asking what attendees do in their spare time. Don’t push them toward controversial topics; talk about hobbies instead.

11. Ask A Day to Day Life Question

Before a meeting begins, there’s still time to talk with someone. Ask a question about day-to-day life to get a personal conversation going. People’s families are conversation starters that work; people love to talk about those closest to them.

12. Ask About Social Events

You can have a good conversation about a favourite movie, and you can also get people talking about music. The arts make great conversation starters, particularly if the person owns a business focused on that area.

13. Ask Them To Describe Their Mood

Later in the event, you can tap into their expertise. It’s a great way to make new friends and share your mood. Talking about their social lives is a conversation starter. You can also share your own writings about how you feel on a Zoom board. It’s not a conversation, but a great way to break the ice.

This kind of exchange is a great way to get to know someone, and it encourages the other person to talk about their own life.

14. Register on the Community Board

Before the event begins, find a conversational partner. You can have a text conversation with someone by joining a community board. The boards are a great place to “talk” via messaging.

15. Post on Your Website Pre Event

Post a notice that you have registered for the event, and are looking to connect with others. You can also search for people to connect with on Twitter, LinkedIn, and Facebook.

16. Take Advantage of Real Time Messaging

To get more information from another person, use real-time messaging to present topics. This is a way to build trust and start a conversation. It shows the other person that you value their opinion. It’s a great way to start a conversation.

These Conversation Starters will help you have a great networking event

This gives you an idea of how to begin a conversation. It can be a great idea to talk with someone at an online or brick-and-mortar event. Perhaps you are looking for a job. You can also talk to someone about their needs in order to make a sale.

Whatever your goal, a conversation can help you achieve it. Your business will benefit more if you have more people talking at events. If you are having trouble doing this, consider networking for introverts.

Approaching Regression With Neural Networks Using Tensorflow

This article was published as a part of the Data Science Blogathon.


Every supervised machine learning technique basically solves either classification or regression problems. Classification is a kind of technique where we classify the outcome into distinct categories whose range is usually finite. Whereas a regression technique involves predicting a real number whose range is infinite. Some examples of regression are predicting the price of a house given the number of bedrooms and floors in it or predicting the product rating given specifications of the product etc.

Neural networks are among the most important algorithms, with profound applications in the computer vision and natural language processing domains. Now let’s apply these neural networks to tabular data for a regression task. We will use the TensorFlow and Keras deep learning libraries to build and train our neural network.

Let’s get started…


About the Dataset

The dataset we are using to train our model is the Auto MPG dataset. This dataset consists of different features of commonly used automobiles during the 1980s and 90s. This includes attributes like the number of cylinders, horsepower, the weight of the car etc. Using these features we should predict the miles per gallon or mileage of the automobile. Thus making it a multi-variate regression task. More information about the dataset can be found here.

Getting Started

Let’s get started by importing the required libraries and downloading the dataset.

import pandas as pd
import matplotlib.pyplot as plt
import tensorflow as tf
from tensorflow import keras
from tensorflow.keras import layers
from tensorflow.keras import Sequential
from tensorflow.keras import optimizers

Now let’s download and parse the dataset using pandas into a data frame.

column_names = ["MPG", "Cylinders", "Displacement", "Horsepower", "Weight", "Acceleration", "Model Year", "Origin"]
url = "http://archive.ics.uci.edu/ml/machine-learning-databases/auto-mpg/auto-mpg.data"
df = pd.read_csv(url, names=column_names, sep=" ", comment="\t", na_values="?", skipinitialspace=True)

We have directly downloaded the dataset using the pandas read_csv method and parsed it to a data frame. We can check the contents of the data frame for more details.


Now let’s preprocess this data frame to make it easier for our deep learning model to understand. Before that, it is a good practice to make a copy of the original dataset and then work on it.

# Create a copy for further processing
dataset = df.copy()
# Get some basic data about dataset
print(len(dataset))

Data Preprocessing

Now let’s preprocess our dataset. First, let’s check for null values since it is important to handle all null values before feeding the data into our model (If data consists of null values, the model won’t be able to learn resulting in null results).

# Check for null values dataset.isna().sum()

Since we encountered some null values in the ‘Horsepower’ column, we can handle them easily using the Pandas library. We can usually drop all the null values using dropna() method or also fill those null values with some value like the mean of the entire column etc. Let’s use fillna() method to fill the null values.

# There are some na values. We can fill or remove those rows.
dataset['Horsepower'].fillna(dataset['Horsepower'].mean(), inplace=True)
dataset.isna().sum()

Now we can see that there are no null values present in any columns. According to the dataset, the ‘Origin’ column is not numeric but categorical i.e. each number represents each country. So let’s encode this column by using the pandas get_dummies() method.

dataset['Origin'].value_counts()
dataset['Origin'] = dataset['Origin'].map({1: 'USA', 2: 'Europe', 3: 'Japan'})
dataset = pd.get_dummies(dataset, columns=['Origin'], prefix='', prefix_sep='')
dataset.head()

From the above output, we can observe that the single ‘Origin’ column has been replaced by three columns named after the countries, with 1 or 0 indicating the origin country.

Now, let’s split the dataset into training and testing/validation set. This is useful to test the effectiveness of the model i.e. how good the model generalises on the unseen data.

# Split the Dataset and create train and test sets
train_dataset = dataset.sample(frac=0.8, random_state=0)
test_dataset = dataset.drop(train_dataset.index)
########################################################
# Separate labels and features
train_features = train_dataset.drop(["MPG"], axis=1)
test_features = test_dataset.drop(["MPG"], axis=1)
train_labels = train_dataset["MPG"]
test_labels = test_dataset["MPG"]

We can check some basic statistics about the data using the pandas describe() function.

# Let's check some basic data about dataset
train_dataset.describe().transpose()

We can now proceed with the next data preprocessing step: Normalization.

Data normalization is one of the basic preprocessing steps which will convert the data into a format that the model can easily process. In this step, we scale the data such that the mean will be 0 and the standard deviation will be 1. We will use the sci-kit learn library to do the same.

# But we can also apply normalization using sklearn.
from sklearn.preprocessing import StandardScaler
feature_scaler = StandardScaler()
label_scaler = StandardScaler()
########################################################
# Fit on training data only
feature_scaler.fit(train_features.values)
label_scaler.fit(train_labels.values.reshape(-1, 1))
########################################################
# Transform both training and testing data
train_features = feature_scaler.transform(train_features.values)
test_features = feature_scaler.transform(test_features.values)
train_labels = label_scaler.transform(train_labels.values.reshape(-1, 1))
test_labels = label_scaler.transform(test_labels.values.reshape(-1, 1))

We do the fit and transform separately because the fit will learn the data representation of the input data and transform will apply the learnt representation. In this way, we will be able to avoid looking at the representation/statistics of the test data.
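As a toy illustration (pure Python, with made-up numbers, not part of the original pipeline), learning the statistics on the training split and then applying them to both splits looks like this:

```python
# Learn mean/std on the training data only, then apply them to both splits.
def fit_scaler(values):
    mean = sum(values) / len(values)
    var = sum((v - mean) ** 2 for v in values) / len(values)
    return mean, var ** 0.5

def transform(values, mean, std):
    return [(v - mean) / std for v in values]

train = [10.0, 20.0, 30.0]
test = [25.0]

mean, std = fit_scaler(train)             # statistics come from train only
train_scaled = transform(train, mean, std)
test_scaled = transform(test, mean, std)  # test never influences the statistics

print(mean, std)
print(train_scaled)
print(test_scaled)
```

The test value never influences the learned mean and standard deviation, which is exactly what fit-on-train followed by transform-on-both guarantees.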

Now let’s get into the most exciting part of the process: Building our neural network.

Creating the Model

Let’s create our model by using the Keras Sequential API. We can stack the required layers into the sequential model and then define the required architecture. Now let’s create a basic fully connected dense neural network for our data.

# Now let's create a Deep Neural Network to train a regression model on our data.
model = Sequential([
    layers.Dense(32, activation='relu'),
    layers.Dense(64, activation='relu'),
    layers.Dense(1)
])

We have defined the sequential model by adding 2 dense layers with 32 and 64 units respectively, both using the Rectified Linear Unit activation function i.e. Relu. Finally, we are adding a dense layer with 1 neuron representing the output dimension i.e. a single number. Now let’s compile the model by specifying the loss function and optimizer.


For our model, we are using the RMSProp optimizer and the mean squared error loss function. These are important parameters for our model because the optimizer defines how our model will be improved and loss defines what will be improved.
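The compile call itself is missing from the snippet above; based on the description (RMSProp optimizer, mean squared error loss), it would look something like this sketch:

```python
import tensorflow as tf
from tensorflow.keras import Sequential, layers

# Same architecture as above, repeated here so the sketch is self-contained.
model = Sequential([
    layers.Dense(32, activation='relu'),
    layers.Dense(64, activation='relu'),
    layers.Dense(1)
])

# The optimizer controls how the weights are updated;
# the loss defines what quantity training tries to minimize.
model.compile(optimizer='rmsprop', loss='mse')
```

The string shortcuts 'rmsprop' and 'mse' use Keras defaults; an explicit optimizer object (e.g. with a custom learning rate) could be passed instead.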

It is recommended to try updating/playing with the above model by using different layers or changing the optimizers and loss function and observing how the model performance improves or worsens.

Now we are ready to train the model !!

Model Training and Evaluation

Now let’s train our model by specifying the training features and labels for 100 epochs. We can also pass the validation data to periodically check how our model is performing.

# Now let's train the model
history = model.fit(
    x=train_features,
    y=train_labels,
    epochs=100,
    validation_data=(test_features, test_labels),
    verbose=0)

Since we specified the verbose=0, we won’t get any model training log for every epoch. We can use the model history to plot how our model has performed during the training. Let’s define a function for the same.

# Function to plot loss
def plot_loss(history):
    plt.plot(history.history['loss'], label='loss')
    plt.plot(history.history['val_loss'], label='val_loss')
    plt.ylim([0, 10])
    plt.xlabel('Epoch')
    plt.ylabel('Error (Loss)')
    plt.legend()
    plt.grid(True)
########################################################
plot_loss(history)

Loss Plot

Now we can see that our model achieved a low loss during training; the fully connected layers were able to pick up the patterns in the dataset.

Now finally let’s evaluate the model on our testing dataset.

# Model evaluation on testing dataset model.evaluate(test_features, test_labels)

The model performed well on the testing dataset! We can save this model and use the saved model to predict some other data.

# Save model
model.save("trained_model.h5")

Now, we can load the model and perform predictions.

# Load and perform predictions
saved_model = keras.models.load_model('trained_model.h5')
results = saved_model.predict(test_features)
########################################################
# We can decode using the scikit-learn object to get the result
decoded_result = label_scaler.inverse_transform(results.reshape(-1, 1))
print(decoded_result)

Oh great !! Now we can see the predictions i.e. decoded data given the input features into our model.


In this blog on regression with neural networks, we trained a baseline regression model using deep neural networks on a small and simple dataset and predicted the required output. As a learning challenge, try training more models on bigger datasets, changing the architecture, and experimenting with different hyperparameters such as optimizers and loss functions. Hope you liked my article on regression with neural networks.

About the Author

I’m Narasimha Karthik, a deep learning practitioner and a final-year undergraduate student at PES University, currently working with computer vision and NLP, with experience in the TensorFlow and PyTorch frameworks. You can contact me through LinkedIn and Twitter.

Read more articles on our blog.

The media shown in this article is not owned by Analytics Vidhya and is used at the author’s discretion.


Security Sieves: Misused Technology Leaving Networks Vulnerable

Despite the big money IT executives are spending to protect their companies from security breaches, industry analysts and security consultants say most are misusing the technology they already have installed.

“The problem is pretty widespread,” says Paul Robertson, director of risk assessment at Herndon, Va.-based TruSecure Corp. “We did a study that showed that about 70% of companies with firewalls are vulnerable to attack because they’re not configured properly or they weren’t deployed correctly. And my gut says that’s actually a low percentage.”

Robertson and other analysts say the problem is only growing as users demand more features, layoffs shrink the pool of trained IT and security workers, the quick turnaround of new software versions leaves little time to master new features, and companies focus on making information easy to access at the expense of making it secure.

“It is a problem,” says Richard Power, editorial director of the Computer Security Institute. “Most companies don’t have a dedicated security staff. Instead, it’s just part of some over-worked administrator’s job. That’s simply not enough and it’s leading to some real problems.”

Security Sidebar

How To Get The Most Out Of Your Security Software: Tips on how companies can make sure they’re properly using the security technology they already have.

And those problems are leaving many companies unprotected from attack.

Faulty Firewall Configurations

A firewall, for instance, is only as beneficial as its configuration, according to TruSecure’s Robertson, who admittedly has a stake in convincing companies to test their security setups and hire outside help. The mistake many companies make is to open their firewalls wide, turning on access to protocols and basically permitting everything inside to go out and everything outside to come in.

“The focus of the Internet is all about connectivity and users want more and more connectivity,” says Robertson. “IT has to balance that with security. The more you let through, the less you’re protected.”
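As a hypothetical illustration (not from the article, and the ports shown are invented), a default-deny policy, the opposite of the wide-open configuration Robertson describes, might look like this in an nftables ruleset:

```
table inet filter {
    chain input {
        type filter hook input priority 0; policy drop;  # default-deny: nothing in unless allowed
        ct state established,related accept              # allow replies to traffic we initiated
        iif "lo" accept                                  # loopback traffic
        tcp dport { 80, 443 } accept                     # only the services actually exposed
    }
}
```

Each additional `accept` rule widens the opening, which is exactly the trade-off between connectivity and protection described above.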

Robertson also points out that companies should think of VPNs as communication devices and not necessarily security devices. And users shouldn’t be allowed to be connected to the Internet at the same time as they’re connected to the internal network. Robertson also points out that users should never be allowed to connect to the network from their own personal computers at home, since the company then can’t control how many family members or friends have access to it.

“These are things we think about continually,” says Deb Parks, infrastructure analyst at John Deere Ottumwa. “They’re very particular here about configurations and setup… You’ve got to have people on hand who have the knowledge to keep on top of it all. The money for it needs to be there. The technology needs to be there and the people who know how to really use it need to be in place.”

It Takes More Than Technology

Michael Rasmussen, director of research and information security at Giga Information Group, agrees technology alone isn’t the answer to developing strong network security.

“Don’t throw products at it until you know what you’re trying to protect — until you know exactly how they work,” says Rasmussen. “It’s complex. You’ve got people. You’ve got policies. You’ve got training. You’ve got to work them all together.”

But Mike Riley, chief scientist at R.R. Donnelley & Sons Co., a Chicago-based printing company, says the technology craze that preceded the current economic downturn infected many IT executives with a zeal for the latest and greatest devices and software that overrode their staffs’ ability to technically keep pace.

“In the boom days, technology was developing so rapidly and money was free flowing,” says Riley. “It used to be, ‘Let’s get the equipment in here and look at business models later.’ They would have just finished installing, configuring and testing the latest release of something when the next release would be out. Not enough focus was put on learning the new features and if you don’t know the baseline features, you’ll never figure out the ones that are going to be built on top of them.”

Hyperparameter Tuning Of Neural Networks Using Keras Tuner

This article was published as a part of the Data Science Blogathon


In neural networks we have lots of hyperparameters, and tuning them manually is very hard. That's where Keras Tuner comes in: it makes it simple to tune the hyperparameters of a neural network, much like the Grid Search or Randomized Search you may have seen in machine learning.

In this article, you will learn how to tune the hyperparameters of a neural network using Keras Tuner. We will start with a very simple neural network, then perform hyperparameter tuning and compare the results. You will learn everything you need to know about Keras Tuner.

Developing deep learning models is an iterative process: you start with an initial architecture, then reconfigure it until you get a model that can be trained efficiently in terms of time and compute resources.

The settings you adjust are called hyperparameters: you get an idea, write code, check the performance, and repeat the same process until you reach good performance.

So, there is a way to adjust these settings of your neural network, and the process of finding a good set of hyperparameters is called hyperparameter tuning.

Hyperparameter tuning is a very important part of building a model; if it is skipped, it can cause major problems, such as long training times and useless parameters.

Hyperparameters are usually of two types:

Model-based hyperparameters: these include the number of hidden layers, the number of neurons, etc.

Algorithm-based hyperparameters: these influence speed and efficiency, like the learning rate in gradient descent.
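As a quick illustration (a sketch of my own, not code from this article), both kinds of hyperparameters appear when you define and compile a Keras model:

```python
from tensorflow import keras

# Model-based hyperparameters: the number of layers and neurons per layer.
model = keras.Sequential([
    keras.Input(shape=(10,)),                   # 10 input features
    keras.layers.Dense(64, activation='relu'),  # 64 neurons (a model hyperparameter)
    keras.layers.Dense(1),                      # single output
])

# Algorithm-based hyperparameter: the optimizer's learning rate.
model.compile(optimizer=keras.optimizers.Adam(learning_rate=0.001), loss='mse')

print(model.count_params())  # 769 trainable weights in total
```

Changing the neuron count changes what the model can represent; changing the learning rate changes how fast (and how stably) it learns.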

The number of hyperparameters can increase dramatically for more complex models, and tuning them manually can be quite challenging.

The benefit of the Keras tuner is that it will help in doing one of the most challenging tasks, i.e. hyperparameter tuning very easily in just some lines of code.

Keras Tuner

Keras Tuner is a library for tuning the hyperparameters of a neural network, helping you pick optimal hyperparameters for models implemented in TensorFlow.

To install Keras Tuner, just run the command below:

```
pip install keras-tuner
```

But wait! Why do we need Keras Tuner?

The answer is that hyperparameters play an important role in developing a good model: they can make a large difference, helping you prevent overfitting, achieve a good bias-variance trade-off, and a lot more.

Tuning our hyperparameter using Keras Tuner

First, we will develop a baseline model, and then we will use Keras Tuner to develop a tuned model. I will be using TensorFlow for the implementation.

Step 1 (Download and prepare the dataset)

```python
from tensorflow import keras  # importing keras

# loading the data using the Keras datasets API
(x_train, y_train), (x_test, y_test) = keras.datasets.mnist.load_data()

x_train = x_train.astype('float32') / 255.0  # normalize the training images
x_test = x_test.astype('float32') / 255.0    # normalize the testing images
```

Step 2 (Developing the baseline model)

Now, we will build our baseline neural network using the MNIST dataset, which will help in recognizing digits. So let's build a deep neural network.

```python
model1 = keras.Sequential()
model1.add(keras.layers.Flatten(input_shape=(28, 28)))  # flattening the 28 x 28 images
model1.add(keras.layers.Dense(units=512, activation='relu', name='dense_1'))  # 512 neurons with relu activation
model1.add(keras.layers.Dropout(0.2))  # dropout layer with a rate of 0.2
model1.add(keras.layers.Dense(10, activation='softmax'))  # output layer with 10 classes
```

Step 3 (Compiling and training the model)

Now that we have built our baseline model, it's time to compile and train it. We will use the Adam optimizer with a learning rate of 0.001, and train the model for 10 epochs with a validation split of 0.2.

```python
model1.compile(optimizer=keras.optimizers.Adam(learning_rate=0.001),
               loss=keras.losses.SparseCategoricalCrossentropy(),
               metrics=['accuracy'])

model1.fit(x_train, y_train, epochs=10, validation_split=0.2)
```

Step 4 (Evaluating the model)

Now that the model is trained, we will evaluate it on the test set to see how it performs.

```python
model1_eval = model1.evaluate(x_test, y_test, return_dict=True)
```

Tuning your model using Keras Tuner

Step 1 (Importing the libraries)

```python
import tensorflow as tf
import keras_tuner as kt  # older releases use `import kerastuner as kt`
```

Step 2 (Building the model using Keras Tuner)

Now, you will set up a hypermodel (the model you set up for hypertuning is called a hypermodel). We will define it using the model builder function below, which returns a compiled model with the tuned hyperparameters.

In the classification model below, we will tune two hyperparameters: the number of neurons in the first Dense layer and the learning rate of the Adam optimizer.

```python
def model_builder(hp):
    '''
    Args:
      hp - Keras tuner object
    '''
    # Initialize the Sequential API and start stacking the layers
    model = keras.Sequential()
    model.add(keras.layers.Flatten(input_shape=(28, 28)))

    # Tune the number of units in the first Dense layer
    # Choose an optimal value between 32 and 512
    hp_units = hp.Int('units', min_value=32, max_value=512, step=32)
    model.add(keras.layers.Dense(units=hp_units, activation='relu', name='dense_1'))

    # Add the remaining layers
    model.add(keras.layers.Dropout(0.2))
    model.add(keras.layers.Dense(10, activation='softmax'))

    # Tune the learning rate for the optimizer
    # Choose an optimal value from 0.01, 0.001, or 0.0001
    hp_learning_rate = hp.Choice('learning_rate', values=[1e-2, 1e-3, 1e-4])

    model.compile(optimizer=keras.optimizers.Adam(learning_rate=hp_learning_rate),
                  loss=keras.losses.SparseCategoricalCrossentropy(),
                  metrics=['accuracy'])

    return model
```

In the above code, here are some notes:

The Int() method defines the search space for the Dense units. It lets you set a minimum value, a maximum value, and the step size used when incrementing between them.

The Choice() method is used for the learning rate. It lets you define the discrete values to include in the search space when hypertuning.

Step 3 (Instantiating the tuner and tuning the hyperparameters)

You will use the Hyperband tuner. Hyperband is an algorithm developed for hyperparameter optimization: it uses adaptive resource allocation and early stopping to quickly converge on a high-performing model.

The basic algorithm is shown in the picture below; if it isn't clear, feel free to ignore it and move on, as it is a large topic that deserves a blog of its own.

Hyperband determines the number of models to train in a bracket by computing 1 + log_factor(max_epochs) and rounding it up to the nearest integer.
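As a quick sanity check of that formula (a hypothetical helper of my own, not part of the Keras Tuner API), you can compute the bracket count for the settings used in this tutorial:

```python
import math

# Hypothetical helper illustrating the formula above:
# 1 + log_factor(max_epochs), rounded up to the nearest integer.
def hyperband_brackets(max_epochs, factor=3):
    return math.ceil(1 + math.log(max_epochs) / math.log(factor))

# With the settings used in this tutorial (max_epochs=10, factor=3):
print(hyperband_brackets(10, 3))  # 4
```

So with max_epochs=10 and factor=3, Hyperband runs 4 brackets, each allocating training epochs to candidate models more aggressively than the last.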

```python
# Instantiate the tuner
tuner = kt.Hyperband(model_builder,             # the hypermodel
                     objective='val_accuracy',  # objective to optimize
                     max_epochs=10,
                     factor=3,                  # the factor you have seen above
                     directory='dir',           # directory to save logs
                     project_name='khyperband')

# Display the hypertuning settings
tuner.search_space_summary()
```

Output:

```
Search space summary
Default search space size: 2
units (Int)
{'default': None, 'conditions': [], 'min_value': 32, 'max_value': 512, 'step': 32, 'sampling': None}
learning_rate (Choice)
{'default': 0.01, 'conditions': [], 'values': [0.01, 0.001, 0.0001], 'ordered': True}
```

Step 4 (Searching for the best hyperparameters)

```python
stop_early = tf.keras.callbacks.EarlyStopping(monitor='val_loss', patience=5)

# Perform hypertuning
tuner.search(x_train, y_train, epochs=10, validation_split=0.2, callbacks=[stop_early])
best_hp = tuner.get_best_hyperparameters()[0]
```

Step 5 (Rebuilding and training the model with the optimal hyperparameters)

```python
# Build the model with the optimal hyperparameters
h_model = tuner.hypermodel.build(best_hp)
h_model.summary()

h_model.fit(x_train, y_train, epochs=10, validation_split=0.2)
```

Now, you can evaluate this model:

```python
h_eval_dict = h_model.evaluate(x_test, y_test, return_dict=True)
```

Comparison with and without hyperparameter tuning

Here is how the two models compare:

```
BASELINE MODEL:
number of units in 1st Dense layer: 512
learning rate for the optimizer: 0.0010000000474974513
loss: 0.08013473451137543
accuracy: 0.9794999957084656

HYPERTUNED MODEL:
number of units in 1st Dense layer: 224
learning rate for the optimizer: 0.0010000000474974513
loss: 0.07163219898939133
accuracy: 0.979200005531311
```

If you compare training times, the baseline model takes longer to train than the hypertuned model: the tuned model has fewer neurons (224 versus 512), so it is faster.

The hypertuned model is also more robust: compare the loss of the baseline model (0.0801) with the loss of the hypertuned model (0.0716), and you can see it generalizes better.

End Notes

Thanks for reading this article. I hope you found it helpful and that you will use Keras Tuner in your neural networks to build better models.

About the Author

Ayush Singh

I am a 14-year-old learner and machine learning and deep learning practitioner, working in the domains of Natural Language Processing, Generative Adversarial Networks, and Computer Vision. I also make videos on machine learning, deep learning, and GANs on my YouTube channel, Newera. I am a competitive coder as well, and a passionate learner and educator. You can connect with me on LinkedIn: Ayush Singh

The media shown in this article are not owned by Analytics Vidhya and are used at the Author’s discretion.

