As software companies grow, their codebases often become more complex, resulting in accumulated legacy code and technical debt. This situation becomes more challenging when team members - especially those well-versed in the codebase - leave the company. Newer team members may face difficulties understanding the code due to outdated or missing documentation. To overcome these…
Large Language Models (LLMs) have significantly impacted machine learning and natural language processing, with the Transformer architecture central to this progress. Nonetheless, LLMs have their share of challenges, notably in handling lengthy sequences. Traditional attention mechanisms incur computational and memory costs that grow quadratically with sequence length, making processing long sequences…
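That quadratic cost comes from the attention score matrix itself: for a sequence of n tokens, scaled dot-product attention materialises an n × n matrix of scores. The NumPy sketch below is only an illustration of that growth, not code from the article or from any particular model.

```python
# Minimal sketch: the attention score matrix Q @ K^T has shape (n, n),
# so its memory and compute grow quadratically with sequence length n.
import numpy as np

def attention_scores(n_tokens: int, d_model: int = 64) -> np.ndarray:
    """Return the (n_tokens, n_tokens) matrix of scaled dot-product scores."""
    rng = np.random.default_rng(0)
    q = rng.standard_normal((n_tokens, d_model))
    k = rng.standard_normal((n_tokens, d_model))
    return q @ k.T / np.sqrt(d_model)  # shape (n_tokens, n_tokens)

for n in (512, 1024, 2048):
    scores = attention_scores(n)
    print(n, scores.shape, f"{scores.nbytes / 1e6:.1f} MB")
# Doubling the sequence length roughly quadruples the size of the score matrix.
```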
NVIDIA is pushing the boundaries of AI and high-performance computing (HPC) with the launch of its Blackwell platform. Named after the renowned mathematician David Harold Blackwell, the platform introduces two innovative Graphics Processing Units (GPUs) – the B100 and the B200 – which promise to shake up AI and HPC with groundbreaking advancements.
The B100…
Large Vision Language Models (LVLMs) have shown excellent performance in tasks that require comprehension of both text and images, with progress in image-text understanding and reasoning becoming particularly noticeable in region-level tasks like Referring Expression Comprehension (REC). Notably, models like Griffon have demonstrated strong results in tasks such as object detection, indicating significant advances in…
This article details a recent Google study whose goal is to train Large Language Models (LLMs) to better process information represented in graph form. LLMs are typically trained on text, but graphs offer an efficient way of organising information because they explicitly represent relationships between entities (nodes) through the links (edges) that connect them.…
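Since an LLM consumes only text, a graph has to be flattened into tokens before the model ever sees it. The snippet below is a minimal sketch of one possible encoding (a natural-language edge list); the study compares several encodings, and this particular format is an assumption for illustration only.

```python
# Illustrative only: one simple way to serialise a graph into text for an
# LLM prompt. The encoding (named nodes plus sentences for each edge) is an
# assumed example, not the specific format used in the Google study.
from typing import Iterable, Tuple

def graph_to_text(nodes: Iterable[str], edges: Iterable[Tuple[str, str]]) -> str:
    node_part = "Nodes: " + ", ".join(nodes) + "."
    edge_part = " ".join(f"{a} is connected to {b}." for a, b in edges)
    return f"{node_part} {edge_part}"

prompt = graph_to_text(
    nodes=["Alice", "Bob", "Carol"],
    edges=[("Alice", "Bob"), ("Bob", "Carol")],
)
print(prompt)
# Nodes: Alice, Bob, Carol. Alice is connected to Bob. Bob is connected to Carol.
```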
Contrary to the modern perception that philosophy is outdated and eclipsed by science, this article makes the case for philosophy's relevance in our contemporary world. The cynical view of philosophy, often seen as little more than an academic debate among old men, is both unjust and false. Philosophy plays an essential role in developing our…
Large Language Models (LLMs), such as ChatGPT or Google Gemini, respond quickly and generate human-like answers thanks in part to an emerging discipline known as LLMOps (Large Language Model Operations). For every interaction to feel like a dialogue between human beings, response times must resemble those of a natural conversation, a primary goal of LLMs made possible through…
Teaching AI agents new tasks can be a challenging and time-consuming process, often involving iteratively updating a reward function designed by a human expert to motivate the AI’s exploration of possible actions. However, researchers from the Massachusetts Institute of Technology, Harvard University, and the University of Washington have developed a new reinforcement learning approach that…
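As a deliberately toy illustration of why that loop is labour-intensive, here is a hand-designed reward function of the kind an expert might iterate on; the task, the function, and its weights are hypothetical and are not taken from the paper.

```python
# Hypothetical toy example of the conventional workflow described above:
# a human expert hand-writes a reward function and keeps tweaking its weights
# until the agent behaves as intended. Nothing here comes from the
# MIT/Harvard/UW work; it only shows why the loop is tedious.
def reward(distance_to_goal: float, energy_used: float,
           w_goal: float = 1.0, w_energy: float = 0.05) -> float:
    """Reward the agent for getting closer to the goal, penalise wasted energy."""
    return -w_goal * distance_to_goal - w_energy * energy_used

# Typical iteration: the designer notices undesired behaviour, hand-tunes the
# weights, retrains the agent, inspects it again, and repeats.
print(reward(distance_to_goal=2.0, energy_used=10.0))              # -2.5
print(reward(distance_to_goal=2.0, energy_used=10.0, w_goal=5.0))  # -10.5
```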
In a recent symposium titled "Generative AI: Shaping the Future", iRobot co-founder Rodney Brooks urged caution regarding the unbridled optimism around generative artificial intelligence (AI). Generative AI uses machine-learning models to generate new material similar to the data they were trained on, and has proven capable of creative writing, translation, code generation, and creating…