infrastructure to keep pace with the precipitous increase in scale of state-of-the-art AI. NDIF is an NSF-funded project to create an innovative, highly transparent, large-scale AI inference infrastructure
-
on disease processes and therapeutic frontiers. The Center employs nearly 100 faculty members, staff, volunteers, PhD students, visiting scholars, and undergraduate students. These activities have been
-
of and ability to select, adapt, and effectively use large AI foundation models. Professional experience developing solutions using NLP, computer vision, or forecasting. Proficiency with statistical
-
Internet Observatory (nationalinternetobservatory.org), a large, NSF-funded project. The Data Engineer/Scientist provides expert support and guidance for data-intensive research activities within a
-
, cloud technologies, Blockchain, Big Data technologies/platforms, and others. This position is new and will require leadership, initiative, innovation, and problem-solving skills, as well as the ability
-
into a large lecture (ARTH 1001), run by a permanent faculty member, and smaller recitation sections (ARTH 1002), run by the PTL. The lecture introduces students to key concepts and terms, which then get
-
instructor to teach a course titled “Film Music.” The successful candidate will teach a large-enrollment class that introduces students to the history of film music and ways of listening to music in film. A
-
, papers, and journal articles; and disseminate research results through publications, open source software, and open data. Qualifications: A recent PhD degree in civil, architectural, mechanical, electrical
-
: Advanced experience with distributed data systems such as Apache Spark, indicating a strong capability in handling and processing large-scale data essential for complex AI projects. Proven expertise in