Monday, July 15, 2024

WiMi Announced the Optimization of Artificial Neural Networks Using Group Intelligence Algorithm


WiMi Hologram Cloud Inc., a leading global Hologram Augmented Reality ("AR") Technology provider, announced that it has adopted a group intelligence algorithm to optimize artificial neural networks. The algorithm simplifies both the determination of the network structure and the training of the network. Compared with traditional training algorithms, the group intelligence algorithm is better at finding the optimal connection weights and biases.

The group intelligence algorithm is a meta-heuristic optimization algorithm inspired by the behavioral patterns that groups of animals and insects exhibit as their environment changes. These algorithms use the simple collective behavior of certain groups of biological organisms to generate group intelligence. This allows group intelligence algorithms to solve complex optimization problems through the interaction between groups of artificial search agents and their environment. Group intelligence algorithms can handle different types of optimization problems, including continuous, discrete, and multi-objective problems, and therefore have a wide range of applications across many fields.

WiMi used a group intelligence algorithm to improve the generalization ability of artificial neural networks by optimizing their connection weights and biases, or their network structure. The following are the steps of the algorithm:

Determine the structure and parameters of the neural network: Setting and adjusting the structure and parameters of the neural network according to the specific problem, such as the number of layers, the number of neurons in each layer, the activation functions, and so on.

Prepare the training dataset: Selecting an appropriate training dataset for training the neural network.

Initialize the population: Randomly generating a set of candidate solutions to the problem, forming the initial population. In the context of neural network optimization, this typically means randomly generating a set of initial weight and bias values as the network's initial solutions.


Calculate the fitness: A fitness function is defined based on the nature of the problem and is used to evaluate the quality of each solution. In the context of neural network optimization, this can include calculating the error between the output of the network and the actual label as the fitness.

Search: Updating each solution in the population according to an update rule, either one based on modeling the movement of swarming organisms, as in particle swarm optimization (PSO), the artificial fish swarm algorithm (AFSA), and the shuffled frog leaping algorithm (SFLA), or one defined by some other algorithmic mechanism, as in ant colony optimization (ACO). Each update considers the fitness of each solution together with stochastic factors to improve search efficiency.

Termination conditions: Stopping the search once a termination condition is met, such as reaching a preset maximum number of iterations or finding a satisfactory solution.

Testing and evaluation: Testing and evaluating the optimized neural network using a test dataset to verify its performance and generalization ability.
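The steps above can be sketched in code. The following is a minimal illustration, not WiMi's actual implementation: it uses particle swarm optimization (one of the swarm-based update rules mentioned above) to fit the weights and biases of a small one-hidden-layer network on the XOR problem. All network sizes, swarm hyperparameters, and variable names here are illustrative assumptions.

```python
# Illustrative sketch: PSO-trained neural network on XOR.
# Network structure, swarm size, and coefficients are assumed, not from the source.
import numpy as np

rng = np.random.default_rng(0)

# Steps 1-2: network structure (2-4-1) and training dataset (XOR).
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([0.0, 1.0, 1.0, 0.0])
N_IN, N_HID, N_OUT = 2, 4, 1
DIM = N_IN * N_HID + N_HID + N_HID * N_OUT + N_OUT  # total weights + biases

def forward(params, x):
    """Unpack a flat parameter vector into layer weights and run the network."""
    i = 0
    W1 = params[i:i + N_IN * N_HID].reshape(N_IN, N_HID); i += N_IN * N_HID
    b1 = params[i:i + N_HID]; i += N_HID
    W2 = params[i:i + N_HID * N_OUT].reshape(N_HID, N_OUT); i += N_HID * N_OUT
    b2 = params[i:]
    h = np.tanh(x @ W1 + b1)
    return (1.0 / (1.0 + np.exp(-(h @ W2 + b2)))).ravel()  # sigmoid output

def fitness(params):
    """Step 4: error between network output and the actual labels (MSE)."""
    return float(np.mean((forward(params, X) - y) ** 2))

# Step 3: initialize a population of candidate weight/bias vectors.
N_PARTICLES, ITERS = 30, 300
pos = rng.uniform(-1, 1, size=(N_PARTICLES, DIM))
vel = np.zeros_like(pos)
pbest = pos.copy()                                  # each particle's best solution
pbest_fit = np.array([fitness(p) for p in pos])
gbest = pbest[np.argmin(pbest_fit)].copy()          # swarm's best solution

# Steps 5-6: PSO update rule, stopping at a preset maximum number of iterations.
W, C1, C2 = 0.7, 1.5, 1.5  # inertia and attraction coefficients (assumed values)
for _ in range(ITERS):
    r1, r2 = rng.random((2, N_PARTICLES, DIM))      # stochastic factors
    vel = W * vel + C1 * r1 * (pbest - pos) + C2 * r2 * (gbest - pos)
    pos = pos + vel
    fits = np.array([fitness(p) for p in pos])
    improved = fits < pbest_fit
    pbest[improved] = pos[improved]
    pbest_fit[improved] = fits[improved]
    gbest = pbest[np.argmin(pbest_fit)].copy()

# Step 7: test the optimized network.
preds = (forward(gbest, X) > 0.5).astype(float)
print("predictions:", preds, "best mse:", min(pbest_fit))
```

Note that each particle is simply a flattened vector of all weights and biases, so the fitness function never needs gradients; this is what lets a swarm method train the network where backpropagation would normally be used.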

The group intelligence optimization algorithm is a stochastic search method, so the result it produces is not guaranteed to be the global optimum, but it is usually a good solution. In addition, WiMi will incorporate other techniques, such as feature selection and data pre-processing, to further improve the performance and generalization of the neural network.

SOURCE: GlobeNewswire

