Friday, December 27, 2024

WiMi Announced the Optimization of Artificial Neural Networks Using Group Intelligence Algorithm


WiMi Hologram Cloud Inc., a leading global Hologram Augmented Reality ("AR") Technology provider, announced that it has adopted a group intelligence (swarm intelligence) algorithm to optimize artificial neural networks. The algorithm streamlines both the determination of the network structure and the training of the network. Compared with traditional training algorithms, the group intelligence algorithm is better at finding optimal connection weights and biases.

A group intelligence algorithm is a meta-heuristic optimization method inspired by the behavioral patterns that groups of animals and insects exhibit as their environment changes. These algorithms turn the simple collective behavior of certain groups of biological organisms into group intelligence, allowing them to solve complex optimization problems through the interaction between a population of artificial search agents and its environment. Group intelligence algorithms can handle many types of optimization problems, including continuous, discrete, and multi-objective problems, and therefore have a wide range of applications across fields.
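As a hedged illustration of this interaction between search agents and an environment, the following minimal particle swarm optimization (PSO) sketch minimizes a simple continuous function. The function, parameter values, and helper name `pso_minimize` are illustrative assumptions for this example, not WiMi's implementation.

```python
import random

def pso_minimize(f, dim, n_particles=30, iters=200,
                 w=0.7, c1=1.5, c2=1.5, bound=5.0):
    """Minimal PSO sketch: a swarm of agents shares its best-known
    position and each agent is pulled toward its own and the swarm's best."""
    rng = random.Random(0)
    # Initialize agent positions randomly and velocities at zero.
    pos = [[rng.uniform(-bound, bound) for _ in range(dim)]
           for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]                  # personal bests
    pbest_val = [f(p) for p in pos]
    g = min(range(n_particles), key=lambda i: pbest_val[i])
    gbest, gbest_val = pbest[g][:], pbest_val[g] # global (swarm) best

    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = rng.random(), rng.random()
                # Velocity update: inertia + pull toward personal/global best.
                vel[i][d] = (w * vel[i][d]
                             + c1 * r1 * (pbest[i][d] - pos[i][d])
                             + c2 * r2 * (gbest[d] - pos[i][d]))
                pos[i][d] += vel[i][d]
            val = f(pos[i])
            if val < pbest_val[i]:
                pbest[i], pbest_val[i] = pos[i][:], val
                if val < gbest_val:
                    gbest, gbest_val = pos[i][:], val
    return gbest, gbest_val

# Sphere function: global minimum 0 at the origin.
best, best_val = pso_minimize(lambda x: sum(v * v for v in x), dim=2)
```

No single agent follows a gradient here; the swarm's shared bests alone steer the search, which is why such methods also apply to discrete and non-differentiable problems.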

WiMi used a group intelligence algorithm to improve the generalization ability of artificial neural networks by optimizing their connection weights and biases or their network structure. The algorithm proceeds in the following steps:

Determine the structure and parameters of the neural network: Setting and adjusting the structure and parameters of the neural network according to the specific problem, such as the number of layers, the number of neurons in each layer, and the activation functions.

Prepare the training dataset: Selecting an appropriate training dataset for training the neural network.

Initialize the population: Randomly generating a set of candidate solutions as the initial population. In the context of neural network optimization, this means randomly generating a set of initial weight and bias values as initial solutions for the network.


Calculate the fitness: Defining a fitness function based on the nature of the problem and using it to evaluate the quality of each solution. In the context of neural network optimization, this can mean calculating the error between the network's output and the actual label as the fitness.

Search: Updating each solution in the population according to an update rule, either one that models the movement steps of swarming organisms, as in particle swarm optimization (PSO), the artificial fish swarm algorithm (AFSA), and the shuffled frog leaping algorithm (SFLA), or one set by some other algorithmic mechanism, as in ant colony optimization (ACO). Each update takes the fitness of the solutions and stochastic factors into account to improve search efficiency.

Termination conditions: Stopping the process once certain termination conditions are satisfied, such as reaching a preset maximum number of iterations or finding a satisfactory solution.

Testing and evaluation: Testing and evaluating the optimized neural network using a test dataset to verify its performance and generalization ability.
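The steps above can be sketched end to end by treating a tiny network's weights and biases as the positions searched by a basic particle swarm. The 2-2-1 architecture, the XOR toy dataset, the hyperparameters, and every identifier below are illustrative assumptions rather than WiMi's actual system.

```python
import math
import random

# A tiny 2-2-1 network: its 9 weights and biases form the 9-dimensional
# search space. All names and sizes here are illustrative assumptions.
def net_forward(params, x):
    h = [math.tanh(params[2 * j] * x[0] + params[2 * j + 1] * x[1]
                   + params[4 + j])
         for j in range(2)]                        # hidden layer, tanh
    z = params[6] * h[0] + params[7] * h[1] + params[8]
    z = max(-60.0, min(60.0, z))                   # guard against exp overflow
    return 1.0 / (1.0 + math.exp(-z))              # sigmoid output

# Step "Calculate the fitness": mean squared error on a toy XOR training set.
DATA = [([0, 0], 0), ([0, 1], 1), ([1, 0], 1), ([1, 1], 0)]
def fitness(params):
    return sum((net_forward(params, x) - y) ** 2 for x, y in DATA) / len(DATA)

# Steps "Initialize the population", "Search", "Termination conditions":
# a plain PSO loop with a fixed iteration budget.
rng = random.Random(1)
DIM, N, ITERS = 9, 40, 400
pos = [[rng.uniform(-3, 3) for _ in range(DIM)] for _ in range(N)]
vel = [[0.0] * DIM for _ in range(N)]
pbest = [p[:] for p in pos]
pbest_val = [fitness(p) for p in pos]
g = min(range(N), key=lambda i: pbest_val[i])
gbest, gbest_val = pbest[g][:], pbest_val[g]

for _ in range(ITERS):                             # termination: budget reached
    for i in range(N):
        for d in range(DIM):
            r1, r2 = rng.random(), rng.random()
            vel[i][d] = (0.7 * vel[i][d]
                         + 1.5 * r1 * (pbest[i][d] - pos[i][d])
                         + 1.5 * r2 * (gbest[d] - pos[i][d]))
            pos[i][d] += vel[i][d]
        val = fitness(pos[i])
        if val < pbest_val[i]:                     # update personal best
            pbest[i], pbest_val[i] = pos[i][:], val
            if val < gbest_val:                    # update swarm best
                gbest, gbest_val = pos[i][:], val

# Step "Testing and evaluation": inspect the optimized network's outputs.
preds = [net_forward(gbest, x) for x, _ in DATA]
```

Note that no gradients are computed: the swarm only evaluates the fitness function, which is what lets such methods also search over non-differentiable choices such as network structure.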

The group intelligence optimization algorithm is a stochastic search method, so the result obtained is not necessarily the global optimum, but it is usually a good solution. In addition, WiMi will incorporate other techniques, such as feature selection and data pre-processing, to further improve the performance and generalization of the neural network.

SOURCE: GlobeNewswire
