The end of Dennard scaling and Moore's Law, and the resulting deceleration in the performance of standard microprocessors, are not problems to be solved but facts that, once recognized, offer remarkable opportunities. We are witnessing an explosion of novel computer architectures, which makes this an exciting time in both academia and industry. Among these architectures is quantum computing. Regarding this paradigm, modern quantum theory, one of the great achievements of twentieth-century science, will soon celebrate its centenary. Estimates suggest that up to 40% of US GDP depends on technologies whose fundamental foundation is provided by that theory. A discipline approaching its centenary is expected to be mature and well examined, and quantum theory can indeed be regarded as one of the most robust and fertile scientific paradigms ever developed. Quantum computers can solve certain problems much faster than a classical computer running any known classical algorithm. Although the technologies for building quantum computers are still in their early stages, we can already consider their scalability and reliability in the context of large-scale quantum computer design. Designing such systems requires understanding what it takes to design and model a balanced, fault-tolerant quantum computer architecture. The goal of this part of the fellowship is to provide architectural abstractions for understanding a quantum computer and for exploring the system-level challenges of achieving scalable, fault-tolerant quantum computing. The project will develop several systems based on quantum computing, with potential applications in machine learning, communications, and computer security, among other domains.