Data analytics and machine learning in complex networks
Selected Topics in Machine Learning
Optimization, Computer Vision and Networks
Machine learning (ML) has been among the most active topics in computer science over the past decade, and its influence now extends across the entire field. The increasing amounts of data generated by modern complex systems, such as the energy grid, social media platforms, sensor networks, and cloud-based services, call for distributed data processing, in particular, the design of scalable algorithms that account for storage and communication constraints and support coordinated decision-making. Furthermore, graphs appear in machine learning in many forms. Often the input data can be naturally represented as a graph, as in relational learning tasks applied to social networks and graph kernels applied to chemical data. Other times, graphs serve as a framework for expressing some intrinsic structure in the data, as in graphical models and non-linear embeddings. In both cases, recent advances in representation learning (or graph embedding) and deep learning have generated renewed interest in machine learning on graphs. Machine learning on graphs (MLG) is an exciting and growing research topic because many relevant real-world problems (in recommendation, infrastructure, healthcare, etc.) have structure that can be captured as nodes, edges, and their attributes.
Introduce the audience to recent developments in computer vision, machine learning over networks, and distributed optimization theory, along with their applications to engineering.
- Show computer vision applications to healthcare systems and generative models.
- Explain basic concepts in network science and graph theory and their applications to machine learning over networks.
- Describe the distributed optimization problem and consensus-based methods for decentralized machine learning.
Senior undergraduate and graduate students in engineering, computer science, and mathematics. Some prior knowledge of linear algebra, probability theory, and real analysis is recommended.
The speakers will provide a concise description of advanced developments in computer vision, machine learning over graphs, and distributed optimization. We will focus on a strong foundational understanding of the concepts and their applications to modern engineering systems.
Day 1 (Jun 13)
Sessions 1 to 3 – Networks: (Arlei Silva)
- Network science
- Spectral graph theory
- Label propagation
- Graph neural networks
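To give a flavor of the label propagation topic in these sessions, here is a minimal sketch on a toy 5-node path graph (the graph and labels are made-up illustrative data, not course material):

```python
import numpy as np

# Toy label propagation on the path graph 0-1-2-3-4: nodes 0 and 4
# carry known labels (+1 and -1); the remaining nodes are inferred by
# repeatedly averaging their neighbors' labels.
A = np.array([  # adjacency matrix of the path graph
    [0, 1, 0, 0, 0],
    [1, 0, 1, 0, 0],
    [0, 1, 0, 1, 0],
    [0, 0, 1, 0, 1],
    [0, 0, 0, 1, 0],
], dtype=float)
P = np.diag(1.0 / A.sum(axis=1)) @ A   # row-stochastic transition matrix

labels = np.array([1.0, 0.0, 0.0, 0.0, -1.0])       # seed labels
seeded = np.array([True, False, False, False, True])

f = labels.copy()
for _ in range(100):
    f = P @ f                   # each node averages its neighbors
    f[seeded] = labels[seeded]  # clamp the seeds back to their labels

print(np.round(f, 3))  # nodes near node 0 end up positive, near node 4 negative
```

The same clamped-averaging scheme underlies classic semi-supervised label propagation; on larger graphs the iteration runs until the change in `f` falls below a tolerance.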
Session 4 - Lab: (Rafael Linero)
- MATLAB and Python: a first look at machine learning tools
Day 2 (Jun 14)
Session 1: (Carlos Parra)
- From machine learning to deep learning
Sessions 2-4 - Distributed Optimization: (Cesar Uribe)
- Basics of optimization
- The consensus problem
- Distributed learning
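The consensus problem listed above can be illustrated with a short sketch (toy values of my own choosing, not the course's code): agents on a ring graph repeatedly average with their neighbors, and every local value converges to the global mean.

```python
import numpy as np

# Average consensus on a 5-agent ring: each agent mixes its value
# with its two neighbors using equal weights of 1/3.
n = 5
x = np.array([1.0, 3.0, 5.0, 7.0, 9.0])  # private initial values (mean = 5)

W = np.zeros((n, n))  # doubly stochastic mixing matrix for the ring
for i in range(n):
    W[i, i] = 1 / 3
    W[i, (i - 1) % n] = 1 / 3  # left neighbor on the ring
    W[i, (i + 1) % n] = 1 / 3  # right neighbor on the ring

for _ in range(200):
    x = W @ x  # one communication round of local averaging

print(np.round(x, 6))  # all entries approach the mean, 5.0
```

Consensus-based distributed learning methods replace the scalar values with local model parameters or gradients, interleaving such averaging rounds with local optimization steps.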
Day 3 (Jun 15) - Computer Vision
Session 1: (Carlos Parra)
- Supervised and unsupervised learning
Sessions 2-4: (Guha Balakrishnan)
- Neural networks in vision, an overview
- Texture and style synthesis
- Generative image models
- Neural rendering
Day 4 (Jun 16) - Labs & Demos
- Image classification using transfer learning. Part I. (Carlos Parra and Rafael Linero)
- Networks (Arlei Silva)
- Distributed optimization (Cesar Uribe)
- Computer vision (Guha Balakrishnan)
Day 5 (Jun 17)
Session 1 - Lab: (Carlos Parra and Rafael Linero)
- Image classification using transfer learning. Part II.
Sessions 2-4 - ML Research Symposium:
(Cesar Uribe, Arlei Silva and Guha Balakrishnan)
- Invited Speakers Symposium
- Graduate school in the US
- Arlei Silva is an Assistant Professor of Computer Science at Rice University. His research focuses on developing algorithms and models for mining and learning from complex datasets, broadly defined as data science, especially for data represented as graphs/networks. He is particularly interested in problems motivated by computational social science, infrastructure, and healthcare. To address these problems, he applies tools from machine learning, network science, graph theory, linear algebra, optimization, and statistics. He received a Ph.D. in Computer Science from the University of California, Santa Barbara, advised by Ambuj Singh, where he was also a postdoctoral scholar. Before that, he received B.Sc. and M.Sc. degrees in Computer Science from Universidade Federal de Minas Gerais, in Brazil, advised by Wagner Meira Jr. He has also been a visiting scholar at the Rensselaer Polytechnic Institute, hosted by Mohammed J. Zaki.
- Cesar A. Uribe is the Louis Owen Jr. Assistant Professor in the Department of Electrical and Computer Engineering at Rice University. He received M.Sc. degrees in systems and control from the Delft University of Technology, in The Netherlands, and in applied mathematics from the University of Illinois at Urbana-Champaign, in 2013 and 2016, respectively. He received the Ph.D. degree in electrical and computer engineering from the University of Illinois at Urbana-Champaign in 2018. He was a Postdoctoral Associate in the Laboratory for Information and Decision Systems (LIDS) at the Massachusetts Institute of Technology (MIT) until 2020 and holds a visiting professor position at the Moscow Institute of Physics and Technology. His research interests include distributed learning and optimization, decentralized control, algorithm analysis, and computational optimal transport.
- Guha Balakrishnan grew up in the towns of East and South Brunswick, New Jersey. In 2011, he received B.S. degrees in Computer Science Engineering and Computer Engineering from the University of Michigan, Ann Arbor. He then went back east to MIT, where he completed his M.S. in 2013 and Ph.D. in 2018 in CSAIL, under the supervision of John Guttag and Frédo Durand. After the Ph.D., he was a postdoctoral researcher in Bill Freeman's group at MIT from 2018 to 2020 and a scientist at AWS from 2020 to 2021, working on fairness and accountability of AI systems.
Course intro: Monday 8:30 am – 9 am.
Session 1: 9 am - 10:30 am
Break (10:30 - 11:00)
Session 2: 11:00 am - 12:30 pm
Lunch: 12:30 pm - 2:30 pm
Session 3: 2:30 - 3:30 pm
Break (3:30 - 4:00)
Session 4: 4:00 - 5:30 pm
Sessions Tuesday to Friday:
Session 1: 8 am - 10:00 am
Break (10:00 - 10:30)
Session 2: 10:30 am - 12:00 pm
Lunch: 12:00 pm - 1:30 pm
Session 3: 1:30 - 3:00 pm
Break (3:00 - 3:30)
Session 4: 3:30 - 5:00 pm
4% early-payment discount on courses or diploma programs when paying 30 calendar days before the start date (can be combined with other discounts)
10% for alumni and Cafam affiliates
15% for groups of 3 to 5 participants in the same course or diploma program
20% for groups of 6 or more, and for a third consecutive course or diploma program
Payment methods: cash, cashier's check, credit card (all cards accepted), or invoice (cuenta de cobro).