AITopics

AITopics is the Internet's largest collection of information about the research, the people, and the applications of Artificial Intelligence. Our mission is to educate and inspire through a wide variety of curated and organized resources gathered from news, conferences and publications.

AITopics 3.0 is brought to you by The Association for the Advancement of Artificial Intelligence (AAAI) and is powered by AI technology from i2k Connect. The new engine combines machine learning with subject matter expert knowledge to automatically tag documents with unique, accurate and consistent metadata. Building on the familiar online shopping experience, you can discover and analyze just the information you need.

When you first arrive, you will see the latest information. As you add search terms and filters, the matching documents are updated.

How to focus

  • Keywords. Enter text in the search box.
  • Date. Choose a date in the date box or click the icon to enter a date range. You may also click the icon to see a histogram of items by publication year. Slide the circles along the axis to tighten the focus.
  • Theme. Select a collection of related sources (RSS feeds, Web sites, Blogs, Twitter feeds), or a file folder hierarchy from the i2k Connect reference library.
  • Concept Tag. The i2k Connect Platform identifies important words and phrases found in an item. Select from the Concept Tag menu to focus on items with a particular tag. You can also click on the icon to explore the tags associated with the resulting set of items.
  • Topic. The i2k Connect Platform classifies items into topics arranged in hierarchical taxonomies. Select from a View menu to focus on a topic. Once you have selected a topic, you have the option to zoom in on Overviews, Instructional Materials, past weekly AI-Alerts, or Classics about the topic. Click News to return to the default display.
  • Source. The i2k Connect Platform scans content from hundreds of sources. Select from the Source menu to focus on one.
  • File Type. Web Page, PDF, Microsoft Word, ...

When you use this site on a mobile device, tap the icon at the top of the screen to show or hide the search box, and tap the icon to show or hide the left-hand-side menus.

Click on any page for help or display the complete user manual.

Search Result Display

Search results are shown in the right column on large displays and cover the screen on small displays.

The most recent items are displayed first. Click "Sort results by" to sort by Relevance or Title.

Click on "Show as Treemap" for a graphical display of the distribution of items. Click on any box to see more detail.

Information displayed for each item includes:

  • Title. For a Web page, the title is identified from the Web page metadata. For files like PDFs and Microsoft Office documents, the title is identified from the content of the file. Clicking on the title will open the item in a new browser tab.
    On a small-display device, such as a phone, click next to the title to see the additional metadata.
  • Source and Age. Where the item was published and how long ago. Click the icon to see detailed information about the item.
  • Summary. An auto-generated summary is displayed for each item. If you have entered words into the search box, a snippet of text is displayed with the words highlighted in context.
  • Concept Tags. Important words or phrases found in the item. Click one to focus on items that include the same tag.
  • Source. News, Twitter, file or Web page. Click the icon to focus on that source.
  • Topics. How the item is seen through the "lens" of each selected hierarchical View (taxonomy). A confidence factor is shown for each topic. There may be several topics or none. Click one to see all items identified for that topic.
  • Add feedback. Tell us about your overall impression of the new experience or about incorrect or missed topics, concept tags and other metadata for this item.
  • More like this. See items that have the same topics in the same highest-to-lowest confidence order.

Top Navigation Bar

  • Home. Return to the starting page and begin a new search.
  • Explore
    • Sources: The number of items associated with each news source (e.g., BBC News), Twitter feed (e.g., @cleantechgroup), and i2k Connect reference library collection (e.g., IAAI [Innovative Applications of Artificial Intelligence] proceedings).
    • Concept Tags. Important words or phrases found in all items known to the platform, sorted by how often they occur.
    • Experimental Trending. Clustering of recent content.
  • Views. Items are classified through the lenses of several hierarchical Views (taxonomies). Use the selector to decide which to display with your search results and in the filter selection menu. The selected Views will stay with your profile until you change them. The available Views are described below.

Search Syntax

Search for items that contain all of the words you enter, or use the advanced search syntax below.

Exact phrase        "This That"
Boolean operators   This AND That (or This That)
                    This OR That
                    This NOT That
Fields              title:"Coalbed Methane"
                    pagetext:environmental (words in the item contents)
                    filepath:JPT (string contained in the URL)
                    concept-tags:"drilling technology"
                    store:"health news" (e.g., name of a news feed)
                    summary:"social responsibility"
                    views:"Technology|Information Technology|Artificial Intelligence" (a topic in one of the selected taxonomy lenses)
Wildcards           title:Coal* (find items with words in the title that start with "Coal")
Proximity           "crude train"~3 (find items where 'train' and 'crude' occur within 3 words of each other; e.g., "train carrying crude")
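
These pieces can be combined in a single query. For example, the following hypothetical query would find AI-classified items tagged with "episodic memory" whose titles contain a word starting with "deep":

    views:"Technology|Information Technology|Artificial Intelligence" AND concept-tags:"episodic memory" AND title:deep*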

Available Views

The i2k Connect Platform classifies items into topics arranged in hierarchical taxonomies called Views.

  1. Genre. Forms of business and technical writing; e.g., earnings press release, overview, best practice.
  2. Industry. Industrial sectors (e.g., Oil & Gas, Freight & Logistics Services), drawn from a simplified version of the Global Industry Classification Standard (GICS).
  3. Technology. Information technologies, with an emphasis on Artificial Intelligence.


Five Functions of the Brain that are Inspiring AI Research

#artificialintelligence

The brain has always been considered the main inspiration for the field of artificial intelligence (AI). For many AI researchers, the ultimate goal of AI is to emulate the capabilities of the brain. That seems like a nice statement, but it's an incredibly daunting task considering that neuroscientists are still struggling to understand the cognitive mechanisms that power the magic of our brains. Despite the challenges, we are increasingly seeing AI research and algorithms that are inspired by specific cognitive mechanisms in the human brain and that have been producing incredibly promising results. Recently, the DeepMind team published a paper about neuroscience-inspired AI that summarizes the circle of influence between AI and neuroscience research.


Estimating scale-invariant future in continuous time

arXiv.org Artificial Intelligence

Natural learners must compute an estimate of future outcomes that follow from a stimulus in continuous time. Critically, the learner cannot in general know a priori the relevant time scale over which meaningful relationships will be observed. Widely used reinforcement learning algorithms discretize continuous time and use the Bellman equation to estimate exponentially-discounted future reward. However, exponential discounting introduces a time scale to the computation of value. Scaling is a serious problem in continuous time: efficient learning with scaled algorithms requires prior knowledge of the relevant scale. That is, with scaled algorithms one must know at least part of the solution to a problem prior to attempting a solution. We present a computational mechanism, developed based on work in psychology and neuroscience, for computing a scale-invariant timeline of future events. This mechanism efficiently computes a model for future time on a logarithmically-compressed scale, and can be used to generate a scale-invariant power-law-discounted estimate of expected future reward. Moreover, the representation of future time retains information about what will happen when, enabling flexible decision making based on future events. The entire timeline can be constructed in a single parallel operation.
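
A rough way to see the scale problem the abstract describes (an illustrative sketch, not code from the paper, with numbers of our own choosing): exponential discounting builds a characteristic time scale of about 1/(1 - gamma) into the value estimate, while a power-law discount has no preferred scale.

    # Illustrative only: exponential vs. power-law discounting of a reward stream.
    def discounted_value(rewards, weight):
        return sum(weight(t) * r for t, r in enumerate(rewards))

    rewards = [0.0] * 50 + [1.0]                                        # a single reward arriving 50 steps away
    exponential = discounted_value(rewards, lambda t: 0.9 ** t)         # gamma = 0.9, time scale ~ 10 steps
    power_law = discounted_value(rewards, lambda t: (t + 1) ** -1.0)    # no characteristic time scale
    print(exponential, power_law)                                       # ~0.005 vs. ~0.02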


Deep Episodic Memory: Encoding, Recalling, and Predicting Episodic Experiences for Robot Action Execution

arXiv.org Artificial Intelligence

We present a novel deep neural network architecture for representing robot experiences in an episodic-like memory which facilitates encoding, recalling, and predicting action experiences. Our proposed unsupervised deep episodic memory model 1) encodes observed actions in a latent vector space and, based on this latent encoding, 2) infers most similar episodes previously experienced, 3) reconstructs original episodes, and 4) predicts future frames in an end-to-end fashion. Results show that conceptually similar actions are mapped into the same region of the latent vector space. Based on these results, we introduce an action matching and retrieval mechanism, benchmark its performance on two large-scale action datasets, 20BN-something-something and ActivityNet, and evaluate its generalization capability in a real-world scenario on a humanoid robot.
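
The retrieval step mentioned above, inferring the most similar previously experienced episodes in a latent vector space, can be pictured as a nearest-neighbour lookup by cosine similarity. This is only a sketch of that one idea with names of our own choosing, not the paper's encoder-decoder architecture.

    import numpy as np

    def most_similar_episodes(query_vec, memory_vecs, k=3):
        # Return indices of the k stored latent vectors closest to the query (cosine similarity).
        memory = np.asarray(memory_vecs, dtype=float)
        query = np.asarray(query_vec, dtype=float)
        sims = memory @ query / (np.linalg.norm(memory, axis=1) * np.linalg.norm(query) + 1e-12)
        return np.argsort(-sims)[:k]

    stored = [[1.0, 0.0], [0.9, 0.1], [0.0, 1.0]]             # three stored episode encodings
    print(most_similar_episodes([1.0, 0.05], stored, k=2))    # indices of the two closest episodes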


Young kids are surprisingly bad at using memory to plan ahead

New Scientist

We used to think that planning for the future was a skill most children have by the age of four, but now it seems that we don't develop the kind of memory needed to do this until we're older. Episodic memory lets us reflect on our past, and imagine ourselves in the future. To find out when children develop this, Amanda Seed at the University of St Andrews in the UK and her colleagues devised a test for 212 children between the ages of three and seven. Each child was taught how to use a box that released a desirable sticker when the correct token was placed in it. An examiner showed them two boxes of different colours and told them that one would remain on a table while they left the room, and the other would be put away.


Embedding Models for Episodic Memory

arXiv.org Artificial Intelligence

In recent years a number of large-scale triple-oriented knowledge graphs have been generated and various models have been proposed to perform learning in those graphs. Most knowledge graphs are static and reflect the world in its current state. In reality, of course, the state of the world is changing: a healthy person becomes diagnosed with a disease and a new president is inaugurated. In this paper, we extend models for static knowledge graphs to temporal knowledge graphs. This enables us to store episodic data and to generalize to new facts (inductive learning). We generalize leading learning models for static knowledge graphs (i.e., Tucker, RESCAL, HolE, ComplEx, DistMult) to temporal knowledge graphs. In particular, we introduce a new tensor model, ConT, with superior generalization performance. The performances of all proposed models are analyzed on two different datasets: the Global Database of Events, Language, and Tone (GDELT) and the database for Integrated Conflict Early Warning System (ICEWS). We argue that temporal knowledge graph embeddings might be models also for cognitive episodic memory (facts we remember and can recollect) and that a semantic memory (current facts we know) can be generated from episodic memory by a marginalization operation. We validate this episodic-to-semantic projection hypothesis with the ICEWS dataset.
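
To make "generalizing a static embedding model to temporal knowledge graphs" concrete, here is a minimal sketch in the style of DistMult with an added time embedding. The factorization shown is our own illustrative assumption, not the ConT model or the exact formulation in the paper.

    import numpy as np

    def distmult_score(e_s, e_r, e_o):
        # Static DistMult: score(s, r, o) = sum_i e_s[i] * e_r[i] * e_o[i]
        return float(np.sum(e_s * e_r * e_o))

    def temporal_score(e_s, e_r, e_o, e_t):
        # Assumed temporal extension: weight each dimension by a time embedding e_t
        return float(np.sum(e_s * e_r * e_o * e_t))

    rng = np.random.default_rng(0)
    e_s, e_r, e_o, e_t = (rng.normal(size=8) for _ in range(4))
    print(distmult_score(e_s, e_r, e_o), temporal_score(e_s, e_r, e_o, e_t))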


Amanuensis: The Programmer's Apprentice

arXiv.org Artificial Intelligence

This document provides an overview of the material covered in a course taught at Stanford in the spring quarter of 2018. The course draws upon insight from cognitive and systems neuroscience to implement hybrid connectionist and symbolic reasoning systems that leverage and extend the state of the art in machine learning by integrating human and machine intelligence. As a concrete example we focus on digital assistants that learn from continuous dialog with an expert software engineer while providing initial value as powerful analytical, computational and mathematical savants. Over time these savants learn cognitive strategies (domain-relevant problem solving skills) and develop intuitions (heuristics and the experience necessary for applying them) by learning from their expert associates. By doing so these savants elevate their innate analytical skills allowing them to partner on an equal footing as versatile collaborators - effectively serving as cognitive extensions and digital prostheses, thereby amplifying and emulating their human partner's conceptually-flexible thinking patterns and enabling improved access to and control over powerful computing resources.


Integrating Episodic Memory into a Reinforcement Learning Agent using Reservoir Sampling

arXiv.org Machine Learning

Episodic memory is a psychology term which refers to the ability to recall specific events from the past. We suggest one advantage of this particular type of memory is the ability to easily assign credit to a specific state when remembered information is found to be useful. Inspired by this idea, and the increasing popularity of external memory mechanisms to handle long-term dependencies in deep learning systems, we propose a novel algorithm which uses a reservoir sampling procedure to maintain an external memory consisting of a fixed number of past states. The algorithm allows a deep reinforcement learning agent to learn online to preferentially remember those states which are found to be useful to recall later on. Critically this method allows for efficient online computation of gradient estimates with respect to the write process of the external memory. Thus unlike most prior mechanisms for external memory it is feasible to use in an online reinforcement learning setting.
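
Reservoir sampling itself is a standard technique for keeping a uniform fixed-size sample of a stream. The minimal sketch below is plain Algorithm R, without the learned, gradient-based write process the abstract refers to.

    import random

    def reservoir_update(memory, state, t, capacity):
        # Keep a uniform sample of `capacity` past states; `t` is the 1-based index of the new state.
        if len(memory) < capacity:
            memory.append(state)
        else:
            j = random.randint(1, t)          # overwrite a random slot with probability capacity / t
            if j <= capacity:
                memory[j - 1] = state

    memory = []
    for t in range(1, 1001):
        reservoir_update(memory, state=t, t=t, capacity=10)
    print(memory)                             # ten states drawn uniformly at random from the 1000 seen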


Lifelong Learning of Spatiotemporal Representations with Dual-Memory Recurrent Self-Organization

arXiv.org Artificial Intelligence

Humans excel at continually acquiring and fine-tuning knowledge over sustained time spans. This ability, typically referred to as lifelong learning, is crucial for artificial agents interacting in real-world, dynamic environments where i) the number of tasks to be learned is not pre-defined, ii) training samples become progressively available over time, and iii) annotated samples may be very sparse. In this paper, we propose a dual-memory self-organizing system that learns spatiotemporal representations from videos. The architecture draws inspiration from the interplay of the hippocampal and neocortical systems in the mammalian brain argued to mediate the complementary tasks of quickly integrating specific experiences, i.e., episodic memory (EM), and slowly learning generalities from episodic events, i.e., semantic memory (SM). The complementary memories are modeled as recurrent self-organizing neural networks: The EM quickly adapts to incoming novel sensory observations via competitive Hebbian Learning, whereas the SM progressively learns compact representations by using task-relevant signals to regulate intrinsic levels of neurogenesis and neuroplasticity. For the consolidation of knowledge, trajectories of neural reactivations are periodically replayed to both networks. We analyze and evaluate the performance of our approach with the CORe50 benchmark dataset for continuous object recognition from videos. We show that the proposed approach significantly outperforms current (supervised) methods of lifelong learning in three different incremental learning scenarios, and that due to the unsupervised nature of neural network self-organization, our approach can be used in scenarios where sample annotations are sparse.
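
"Competitive Hebbian Learning" can be pictured with the classic winner-take-all update used by self-organizing networks: find the unit closest to the input and pull it toward the input. The sketch below shows only that generic rule, with an assumed learning rate and layout, not the paper's recurrent dual-memory architecture.

    import numpy as np

    def competitive_hebbian_step(units, x, lr=0.1):
        # Move the best-matching unit toward the input x (winner-take-all update).
        winner = int(np.argmin(np.linalg.norm(units - x, axis=1)))
        units[winner] += lr * (x - units[winner])
        return winner

    rng = np.random.default_rng(1)
    units = rng.normal(size=(5, 3))           # five units, three-dimensional inputs
    for _ in range(100):
        competitive_hebbian_step(units, rng.normal(size=3))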


Been There, Done That: Meta-Learning with Episodic Recall

arXiv.org Artificial Intelligence

Meta-learning agents excel at rapidly learning new tasks from open-ended task distributions; yet, they forget what they learn about each task as soon as the next begins. When tasks reoccur - as they do in natural environments - meta-learning agents must explore again instead of immediately exploiting previously discovered solutions. We propose a formalism for generating open-ended yet repetitious environments, then develop a meta-learning architecture for solving these environments. This architecture melds the standard LSTM working memory with a differentiable neural episodic memory. We explore the capabilities of agents with this episodic LSTM in five meta-learning environments with reoccurring tasks, ranging from bandits to navigation and stochastic sequential decision problems.
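
One way to picture melding an LSTM working memory with an episodic memory is a key-value store that, given the current task context, returns whatever state was saved the last time a similar context was seen. The toy sketch below captures only that retrieval idea with names of our own; it is not the paper's differentiable episodic LSTM.

    import numpy as np

    class EpisodicStore:
        # Toy key-value episodic memory: keys are context embeddings, values are saved states.
        def __init__(self):
            self.keys, self.values = [], []

        def write(self, key, value):
            self.keys.append(np.asarray(key, dtype=float))
            self.values.append(value)

        def read(self, query):
            # Return the value whose key is most similar (cosine) to the query context.
            query = np.asarray(query, dtype=float)
            sims = [float(k @ query) / (np.linalg.norm(k) * np.linalg.norm(query) + 1e-12)
                    for k in self.keys]
            return self.values[int(np.argmax(sims))] if sims else None

    store = EpisodicStore()
    store.write([1.0, 0.0], "state saved for task A")
    store.write([0.0, 1.0], "state saved for task B")
    print(store.read([0.9, 0.1]))             # -> "state saved for task A"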


Episodic Memory Deep Q-Networks

arXiv.org Artificial Intelligence

Reinforcement learning (RL) algorithms have made huge progress in recent years by leveraging the power of deep neural networks (DNN). Despite the success, deep RL algorithms are known to be sample inefficient, often requiring many rounds of interaction with the environments to obtain satisfactory performance. Recently, episodic memory based RL has attracted attention due to its ability to latch on good actions quickly. In this paper, we present a simple yet effective biologically inspired RL algorithm called Episodic Memory Deep Q-Networks (EMDQN), which leverages episodic memory to supervise an agent during training. Experiments show that our proposed method can lead to better sample efficiency and is more likely to find good policies. It only requires 1/5 of the interactions of DQN to achieve many state-of-the-art performances on Atari games, significantly outperforming regular DQN and other episodic memory based RL algorithms.
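
The "supervision" from episodic memory can be sketched as an extra regression target: besides the usual one-step TD target, the Q-value is also pulled toward the best return ever recorded for that state-action pair. The snippet below is only a schematic of that idea with an assumed weighting; the exact loss and memory structure in the paper may differ.

    def emdqn_style_loss(q_value, td_target, best_episodic_return, memory_weight=0.1):
        # Squared TD error plus a term pulling Q toward the best remembered return (schematic).
        return (q_value - td_target) ** 2 + memory_weight * (q_value - best_episodic_return) ** 2

    # Example: current estimate 0.4, bootstrapped TD target 0.5, best remembered return 0.9.
    print(emdqn_style_loss(0.4, 0.5, 0.9))    # -> 0.035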
