
AI uses artificial sleep to learn new task without forgetting the last

Many AIs can only become good at one task, forgetting everything they know if they learn another. A form of artificial sleep could help stop this from happening

Technology 10 November 2022

AIs may need to sleep too (Image: Shutterstock/Ground Picture)

Artificial intelligence can learn and remember how to do multiple tasks by mimicking the way sleep helps us cement what we learned during waking hours.

“There is a huge trend now to bring ideas from neuroscience and biology to improve existing machine learning – and sleep is one of them,” says Maxim Bazhenov at the University of California, San Diego.

Many AIs can only master one set of well-defined tasks – they can’t acquire additional knowledge later on without losing everything they had previously learned. “The issue pops up if you want to develop systems which are capable of so-called lifelong learning,” says Pavel Sanda at the Czech Academy of Sciences in the Czech Republic. Lifelong learning is how humans accumulate knowledge to adapt to and solve future challenges.

Bazhenov, Sanda and their colleagues trained a spiking neural network – a connected grid of artificial neurons resembling the human brain’s structure – to learn two different tasks without overwriting connections learned from the first task. They accomplished this by interspersing focused training periods with sleep-like periods.

The researchers simulated sleep in the neural network by activating the network’s artificial neurons in a noisy pattern. They also ensured that the sleep-inspired noise roughly matched the pattern of neuron firing during the training sessions – a way of replaying and strengthening the connections learned from both tasks.
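As a rough illustration of that idea, a sleep phase can be sketched with simple rate units standing in for spiking neurons: the network is driven by noise biased toward the activity pattern recorded during training, and a Hebbian update reinforces the connections that fire together. The function name, the tanh units, and all parameters below are illustrative assumptions, not the team’s actual code.

```python
import math
import random

random.seed(0)

def sleep_phase(weights, mean_activity, steps=100, noise=0.1, lr=0.01):
    """Hypothetical sketch of a sleep phase: drive the network with random
    input whose mean matches the firing pattern seen during training, then
    apply a Hebbian update so connections used by earlier tasks are replayed
    and strengthened."""
    n_out, n_in = len(weights), len(weights[0])
    for _ in range(steps):
        # Noisy drive biased toward the training-time activity pattern
        x = [m + random.gauss(0.0, noise) for m in mean_activity]
        # Network response (tanh rate units stand in for spiking neurons)
        y = [math.tanh(sum(weights[i][j] * x[j] for j in range(n_in)))
             for i in range(n_out)]
        # Hebbian strengthening: units that fire together wire together
        for i in range(n_out):
            for j in range(n_in):
                weights[i][j] += lr * y[i] * x[j]
    return weights
```

For example, starting from a small uniform weight matrix, `sleep_phase([[0.1] * 3 for _ in range(3)], [0.5, 0.2, 0.1])` nudges the weights toward the recorded activity pattern without any labelled training data.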

The team first tried training the neural network on the first task, then the second, with a single sleep period added at the end. But they quickly realised that this sequence still erased the neural network connections learned from the first task.

Instead, follow-up experiments showed that it was important to “have rapidly alternating sessions of training and sleep” while the AI was learning the second task, says Erik Delanois at the University of California, San Diego. This helped consolidate the connections from the first task that would have otherwise been forgotten.
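In the same illustrative spirit, the alternating schedule might look like the minimal sketch below, where each short task-2 training session is immediately followed by a brief sleep session that replays noisy task-1 activity. A plain Hebbian step stands in for the real spiking-network training, and all names and parameters are assumptions for illustration.

```python
import math
import random

random.seed(1)

def hebbian_step(weights, x, lr=0.01):
    """One Hebbian update of the weight matrix toward input pattern x."""
    y = [math.tanh(sum(w_ij * x_j for w_ij, x_j in zip(row, x)))
         for row in weights]
    for i, row in enumerate(weights):
        for j in range(len(row)):
            row[j] += lr * y[i] * x[j]

def learn_task2_with_sleep(weights, task2_patterns, task1_activity,
                           sleep_steps=10, noise=0.1):
    """Hypothetical interleaved loop: each short task-2 session is followed
    immediately by a brief sleep phase replaying noisy task-1 activity,
    rather than one long sleep period after all the training."""
    for x2 in task2_patterns:
        hebbian_step(weights, x2)                  # short task-2 session
        for _ in range(sleep_steps):               # brief sleep session
            xs = [m + random.gauss(0.0, noise) for m in task1_activity]
            hebbian_step(weights, xs)
    return weights
```

The key design choice the sketch captures is the ordering: because replay happens between every training step on the new task, connections from the first task are reinforced before later updates can overwrite them.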

Experiments showed how a spiking neural network trained in this way could enable an AI agent to learn two different foraging patterns – searching for simulated food particles while avoiding poisonous ones.

“Such a network will have the ability to combine consecutively learned knowledge in smart ways, and apply this learning to novel situations – just like animals and humans do,” says Hava Siegelmann at the University of Massachusetts Amherst.

Spiking neural networks, with their complex, biologically inspired design, haven’t yet proven practical for widespread use because it’s difficult to train them, says Siegelmann. The next big steps for showing this method’s usefulness would require demonstrations with more complex tasks on the artificial neural networks commonly used by tech companies.

One advantage of spiking neural networks is that they are more energy efficient than other neural networks. “I think over the next decade or so there will be kind of a big impetus for a transition to more spiking network technology instead,” says Ryan Golden at the University of California, San Diego. “It’s good to figure those things out early on.”

Journal reference: PLOS Computational Biology, DOI: 10.1371/journal.pcbi.1010628
