Network Science

Theoretical Challenges in Network Science (ENAS 962)

Theoretical challenges in network science, as the title suggests, is about the (beautiful) mathematics behind network models. The course starts with the concept of “tipping points,” or how little things can make a big difference in the formation of random networks. We then talk about the preferential attachment model, or the rich-get-richer phenomenon, along with small-world networks. We again see a different kind of tipping point in the way people can navigate such networks in a decentralized manner.
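
To make the first topic concrete, here is a minimal Python sketch (an illustration, not course material) of a Galton-Watson branching process with Poisson offspring; the survival frequency it prints jumps once the mean offspring count crosses one. The function names and the simulation parameters are hypothetical choices.

import math
import random

def poisson(mu):
    """Sample a Poisson(mu) random variable (Knuth's method)."""
    threshold = math.exp(-mu)
    k, p = 0, 1.0
    while p > threshold:
        k += 1
        p *= random.random()
    return k - 1

def survives(mu, generations=50, cap=1000):
    """Simulate one branching process; True if still alive after `generations` steps."""
    population = 1
    for _ in range(generations):
        if population == 0:
            return False
        if population > cap:   # clearly exploding, call it survival and stop early
            return True
        # each individual has Poisson(mu) children
        population = sum(poisson(mu) for _ in range(population))
    return population > 0

if __name__ == "__main__":
    for mu in (0.8, 1.0, 1.2, 1.5):
        trials = 500
        frac = sum(survives(mu) for _ in range(trials)) / trials
        print(f"mean offspring {mu:.1f}: survival frequency ~ {frac:.3f}")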

We then change topic and see how games can be modeled mathematically. We discuss the concept of equilibrium and show that, under fairly general conditions, it always exists. We look at the price of anarchy, or how much a society suffers if everyone acts selfishly.
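
As a concrete taste of the price of anarchy, the following small Python sketch (an illustration, not course material) works out Pigou's classic two-road routing example, where selfish routing is 4/3 times as costly as the social optimum; the function names and the grid-search resolution are hypothetical choices.

# One unit of traffic chooses between a constant-cost road (cost 1) and a
# congestible road whose cost equals its share of traffic x.
def total_cost(x):
    """Total travel time when a fraction x uses the congestible road."""
    return x * x + (1 - x) * 1.0

# Nash equilibrium: everyone takes the congestible road (its cost never
# exceeds 1, so no single driver can do better by switching).
nash_cost = total_cost(1.0)

# Social optimum: minimize total_cost over x in [0, 1] by a simple grid search.
opt_cost = min(total_cost(i / 10000) for i in range(10001))

print(f"equilibrium cost = {nash_cost:.3f}, optimal cost = {opt_cost:.3f}")
print(f"price of anarchy = {nash_cost / opt_cost:.3f}")   # ~ 4/3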

We then learn about submodular functions and their role in network science. We see how such functions naturally arise in many interesting scenarios, including sensor placement and influence maximization.
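
To illustrate, here is a minimal Python sketch (not course code) of the standard greedy algorithm for maximizing a monotone submodular function under a cardinality constraint, run on a toy coverage instance of the sensor-placement flavor; the instance and the names are hypothetical.

def coverage(selected, cover):
    """f(S) = number of targets covered by the chosen locations (monotone submodular)."""
    covered = set()
    for s in selected:
        covered |= cover[s]
    return len(covered)

def greedy(cover, k):
    """Pick k locations, each time adding the one with the largest marginal gain.
    For monotone submodular f, this achieves at least a (1 - 1/e) fraction of the optimum."""
    chosen = []
    for _ in range(k):
        best = max((c for c in cover if c not in chosen),
                   key=lambda c: coverage(chosen + [c], cover))
        chosen.append(best)
    return chosen

if __name__ == "__main__":
    # Hypothetical toy instance: candidate locations A-D and the targets each one covers.
    cover = {"A": {1, 2, 3}, "B": {3, 4}, "C": {4, 5, 6}, "D": {1, 6}}
    picks = greedy(cover, k=2)
    print(picks, "cover", coverage(picks, cover), "targets")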

The next concept we talk about in this course is Markov chains, and in particular random walks on graphs. We look at a particularly successful application, namely PageRank (or how we naively think Google works). If time permits, we also discuss sampling from large graphs and the way we can simulate this process through carefully designed random walks.
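
For a feel of the random-walk viewpoint, here is a small Python sketch (an illustration, not course material) that computes PageRank by power iteration on a toy web graph, i.e., the stationary distribution of a random walk with teleportation; the graph and parameter values are hypothetical.

def pagerank(links, damping=0.85, iters=100):
    """Power iteration for the random walk with teleportation probability 1 - damping."""
    nodes = list(links)
    n = len(nodes)
    rank = {v: 1.0 / n for v in nodes}
    for _ in range(iters):
        new = {v: (1 - damping) / n for v in nodes}
        for v in nodes:
            out = links[v]
            if out:                      # distribute rank along out-links
                share = damping * rank[v] / len(out)
                for w in out:
                    new[w] += share
            else:                        # dangling node: spread its rank uniformly
                for w in nodes:
                    new[w] += damping * rank[v] / n
        rank = new
    return rank

if __name__ == "__main__":
    # Hypothetical toy web graph: page -> list of pages it links to.
    links = {"a": ["b", "c"], "b": ["c"], "c": ["a"], "d": ["c"]}
    for page, score in sorted(pagerank(links).items(), key=lambda kv: -kv[1]):
        print(f"{page}: {score:.3f}")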

Near the end of the class, we look at epidemics and diffusion models. We see how an outbreak happens and how we can detect it. This topic also has very interesting ties to gossip algorithms and consensus mechanisms, which we briefly discuss.
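
As a small illustration (not course code), the following Python sketch runs one of the diffusion models covered later in the course, the independent cascade model, on a hypothetical toy contact network.

import random

def independent_cascade(graph, seeds, p=0.3):
    """One cascade: each newly activated node gets a single chance to activate
    each inactive neighbor, succeeding independently with probability p."""
    active = set(seeds)
    frontier = list(seeds)
    while frontier:
        next_frontier = []
        for u in frontier:
            for v in graph.get(u, []):
                if v not in active and random.random() < p:
                    active.add(v)
                    next_frontier.append(v)
        frontier = next_frontier
    return active

if __name__ == "__main__":
    # Hypothetical toy contact network (adjacency lists).
    graph = {0: [1, 2], 1: [2, 3], 2: [3, 4], 3: [5], 4: [5], 5: []}
    sizes = [len(independent_cascade(graph, {0})) for _ in range(1000)]
    print("average outbreak size from seed 0:", sum(sizes) / len(sizes))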

Time and Place: There will be two lectures per week, Tuesday and Thursday, in WLH 114 from 1:00 to 2:15.

Lecturer: Amin Karbasi (amin.karbasi@yale.edu), office hours: by appointment.

Grading: The grade will be based on (programming) problem sets (10%), scribe notes (10%), midterm (40%), and a final exam (40%). Class participation (e.g., interaction during the lecture, asking good questions) can also (positively) influence your grade.

Collaboration: For the problem sets, collaboration with one other person is allowed, but: (1) the write-up of all the solutions has to be prepared independently (in particular, you should understand and be able to explain everything that is written in your solution); (2) in your write-up, you should include the name of your collaborator.

Scribe notes: For some lectures, there will be one student in charge of taking detailed notes, typing them up in LaTeX, preparing any needed figures, and sending them to the lecturer within one week after the lecture. These notes will be posted on the course website. When preparing scribe notes, please use this template; just edit it to include your notes (the template requires the preamble.tex and fullpage.sty files to compile; put them in the same directory). If you do not know LaTeX, learn it in 157 minutes from here.

Problem sets: All solutions should be emailed to the lecturer as a PDF (typeset in LaTeX, not scanned) – for your convenience, there will be a template accompanying each problem set.

Lecture 0

Introduction, simple notions and facts.

Look at the slides.

Lectures 1 and 2

Tipping points in branching process.

Look at the scribe notes.

Lecture 3

Branching process: one-by-one exploration and tail bounds.

Look at the scribe notes.

Lecture 4

Random Graphs, Moment Methods.

Look at the scribe notes.

Lecture 5

Sharp and Weak Thresholds.

Look at the scribe notes.

Lecture 6

Tipping point for connectivity.

Look at the scribe notes.

Lectures 7 and 8

Subcritical and supercritical regimes.

Look at the scribe notes and Lin’s slides.

Lectures 9 and 10

Small diameters in random graphs.

Look at the scribe notes.

Lecture 11

Power-law graphs.

Look at the slides.

Lectures 12, 13, and 14

It is practically impossible to write better than this paper.

Look at the slides.

Midterm

Enjoy the midterm.

Lecture 15

Coordination game.

Look at the scribe notes.

Lecture 16

Markov Chain I.

Look at the scribe notes.

Lecture 17

Markov Chain II.

Look at the scribe notes.

Lecture 18

Voter model.

Look at the scribe notes.

Lecture 19

Independent cascade model.

Look at the scribe notes.

Final

Enjoy the final.