I am looking for motivated and hardworking graduate and undergraduate students who want to do research. It is almost impossible to identify the students who will be good researchers unless they already have papers published in international journals (this is one of the reasons why an undergraduate student needs to do research; the other is to see if you are good at it and, most importantly, whether you will like it). Although they provide some clues, neither the GPA nor any other test shows the true potential of a student. So if you have the motivation (this is important) and are interested in the topics on this page, or if you have interesting suggestions on problems related to these areas, let me know and we can work together.

BigData and High Performance Computing

Today, the data one needs to cope with for a major scientific innovation or discovery is immense, distributed, and unstructured. Thanks to advancements in computer hardware and storage technologies, we currently have a good arsenal to manage it. However, due to the size and complexity of the data, we need to combine these resources with scalable and cost-effective algorithms for its analysis. Not every hardware-software combination is effective and efficient enough to provide an insight that can make an impact on science, technology, and hence, on the public. And even with the best combination at hand, we may still suffer in the future, since data growth is exponential; around 90% of the data we have today was generated in the last two years. I work on BigData problems such as genome analysis, research(er) evaluation, and network analysis, and I re-engineer algorithms tailored to specific cutting-edge hardware, e.g., multicore processors, GPUs, and accelerators such as Xeon Phi, to exploit the best features of each hardware configuration. Furthermore, I employ a heterogeneous computing approach that uses various hardware resources simultaneously to obtain much better performance.
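
To make the "scalable algorithms on parallel hardware" idea concrete, here is a minimal sketch of the underlying data-parallel pattern: partition a large input, process the partitions concurrently, and merge the partial results. In my work this pattern runs on multicore CPUs, GPUs, or accelerators; the sketch below uses plain Python threads only for illustration, and the tiny EDGES list is a hypothetical stand-in for a large dataset.

```python
from collections import Counter
from concurrent.futures import ThreadPoolExecutor

# Hypothetical edge list standing in for a massive, distributed dataset.
EDGES = [(0, 1), (0, 2), (1, 2), (2, 3), (3, 4), (0, 4)]

def degree_chunk(chunk):
    """Count vertex degrees within one chunk of the edge list."""
    counts = Counter()
    for u, v in chunk:
        counts[u] += 1
        counts[v] += 1
    return counts

def parallel_degrees(edges, workers=4):
    """Partition the edge list, process chunks concurrently, merge results."""
    size = max(1, len(edges) // workers)
    chunks = [edges[i:i + size] for i in range(0, len(edges), size)]
    total = Counter()
    with ThreadPoolExecutor(max_workers=workers) as pool:
        for partial in pool.map(degree_chunk, chunks):
            total.update(partial)
    return total
```

The same partition-compute-merge structure carries over to heterogeneous settings, where different chunks can be assigned to different devices according to their strengths.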

Parallel Streaming Network Analysis

The entities and their interactions in today's graphs and networks vary with time. Hence, efficient parallel and incremental algorithms are necessary to perform a thorough analysis of dynamic graphs. Furthermore, for applications such as threat detection and biosurveillance, timely countermeasures are crucial, and a lack of real-time analysis might have a drastic impact on people's lives.
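
A small sketch of what "incremental" means here: instead of recomputing a metric from scratch after every change, only the effect of the change is processed. For streaming triangle counting, an inserted edge (u, v) closes exactly one new triangle per common neighbor of u and v, so the count can be maintained with one set intersection per update (illustrative code, not a specific published algorithm):

```python
def insert_edge(adj, tri_count, u, v):
    """Apply one streamed edge insertion and return the updated triangle count.

    adj is an adjacency structure (dict of vertex -> set of neighbors);
    every common neighbor of u and v closes exactly one new triangle.
    """
    new_triangles = len(adj.setdefault(u, set()) & adj.setdefault(v, set()))
    adj[u].add(v)
    adj[v].add(u)
    return tri_count + new_triangles
```

Processing each update in time proportional to the local neighborhood, rather than the whole graph, is what makes real-time analysis of dynamic networks feasible.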

Local network analyses such as counting the number of triangles or finding k-cores and k-trusses have been studied in an incremental context. Social researchers are also interested in non-local metrics for individual network entities and in their rankings. Among such metrics, centrality has been developed to answer questions such as which node is more important in a network, or how central a node is, for many different applications. Betweenness and closeness are common centrality metrics that are expensive to compute even for static graphs. I previously worked on these metrics and developed incremental algorithms to identify the set of vertices that require a centrality update when the graph changes. There are many other interesting problems that arise from changes in a network, such as updating the impact of a research paper in a citation network when it is cited by others, or finding the weakest node/edge in a network in case of an attack.