key: cord-0587052-ep1hqtfr
title: A QUBO Formulation for Eigencentrality
authors: Akrobotu, Prosper D.; James, Tamsin E.; Negre, Christian F. A.; Mniszewski, Susan M.
date: 2021-05-01
journal: nan
DOI: nan
sha: 231e1509064d35bccfb8fd6597cb7f2cde8976e3
doc_id: 587052
cord_uid: ep1hqtfr

The efficient calculation of the centrality or "hierarchy" of nodes in a network has gained great relevance in recent years due to the generation of large amounts of data. The eigenvector centrality is quickly becoming a good metric for centrality due to both its simplicity and fidelity. In this work we lay the foundations for the calculation of eigenvector centrality using quantum computational paradigms such as quantum annealing and gate-based quantum computing. The problem is reformulated as a quadratic unconstrained binary optimization (QUBO) that can be solved on both quantum architectures. The results focus on correctly identifying a given number of the most important nodes in numerous networks given by our QUBO formulation of eigenvector centrality on both the D-Wave and IBM quantum computers.

There are several centrality measures used to identify the most influential node or nodes within a network, each having their own benefits dependent on the data at hand or the results desired. For example, degree centrality [1], which is based purely on the number of connections a node has, could be used for identifying the most popular person within a group of people on a social media platform (number of followers). Closeness centrality [2] is dependent on the length of the paths from one node to all other nodes in a network, prioritizing nodes that are "closer" to all other nodes as more central. This has been used for predicting enzyme catalytic residues from topological descriptions of protein structures [3]. Betweenness centrality [4] is based on the number of times a node appears when two other nodes are connected by their shortest path. This measure is often used in biological networks, for example identifying a specific protein that is important for information flow within a network, which could be used in drug discovery [5]. Katz centrality [6] measures the importance of a node through its immediate connections, and also the connections of other nodes through the immediate neighbors. Katz centrality has also been used within a biological setting, such as identifying disease genes [7]. PageRank centrality [8] is a variant of eigenvector centrality designed to rank web pages by importance based on links between pages or articles. This differs from eigenvector centrality as it takes into account directions between nodes (clicking from one web page to another). It is worth noting at this point that the rankings generated by these different centrality measures are correlated, especially for the most highly ranked nodes [9-13]. Our research is concentrated on a study of the eigenvector centrality (EC) measure [14], and using this to determine the most important nodes in a given network. EC has been applied in many different fields of science, for example identifying the most important amino acid residues in proteins undergoing an allosteric mechanism [15] and predicting flow-paths in porous materials [16].
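As a quick illustration of the distinction between degree centrality and EC drawn above, the following Python sketch uses NetworkX to rank the nodes of a small tree in which a central node joins three hubs. The graph is our own illustrative example (similar in spirit to the fabricated tree graphs discussed later), not one of the paper's benchmark graphs.

import networkx as nx

# Illustrative tree: node 0 joins three hubs (1, 2, 3), and each hub carries four leaves.
# The hubs have the highest degree, but node 0 is the most central under EC because it is
# connected to all of the other well-connected nodes.
G = nx.Graph()
edges = [(0, 1), (0, 2), (0, 3)]
for hub in (1, 2, 3):
    leaves = range(4 * hub, 4 * hub + 4)
    edges += [(hub, leaf) for leaf in leaves]
G.add_edges_from(edges)

deg = nx.degree_centrality(G)
eig = nx.eigenvector_centrality(G, max_iter=1000)

print(sorted(deg, key=deg.get, reverse=True)[:4])   # hubs 1, 2, 3 rank first by degree
print(sorted(eig, key=eig.get, reverse=True)[:4])   # node 0 ranks first by eigenvector centrality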
It is also relevant to current world issues related to the COVID-19 pandemic, such as identifying people deemed as "super-spreaders" and areas that are hot-spots in a pandemic (network of people) [17], and also in the analysis of production chains in the financial market, where micro-sectors are identified as important nodes within a chain [18]. EC is a centrality measure which assigns to each node a value that is proportional to the sum of the values of the node's neighbors [15, 19]. With this measure, the most influential node is a node that is connected to a majority of the other important nodes in the network. The scheme designed to implement the EC measure starts by obtaining the network's adjacency matrix, a matrix that describes the network's connectivity and is defined as follows. Let A = [a_ij]_{n×n} be the adjacency matrix of a network or graph G = (V, E) of n nodes, with entries defined by a_ij = 1 if (i, j) ∈ E and a_ij = 0 otherwise, where E is the set of edges and V is the set of nodes or vertices. The Perron-Frobenius theorem [20] states that there is a unique largest real eigenvalue for a non-negative square matrix (here given by the adjacency matrix), with an eigenvector solution consisting of positive elements. So let the largest eigenvalue of A be λ_1. Then the EC measure assigns to each node the value

x_i = (1/λ_1) Σ_{j=1}^{n} a_ij x_j,  (1)

where x_i is the centrality value given to the i-th node. It is represented in matrix form by

A x = λ_1 x.  (2)

The degree centrality can also be defined as the count of the number of walks of length one that reach the node for which centrality is being computed. EC, instead, is a count of the number of walks of infinite length [9, 14]. A brief demonstration of these concepts is given in the Appendix, with details also found in [10]. Due to the number of uses of centrality measures, it appears to be a good step forward to reformulate these problems for use on quantum computers. As networks become larger and more complicated, such as in drug discovery, the use of quantum computers could prove to be of great help in the future. Our research seeks to reformulate the iterative scheme for the EC problem as a classical optimization problem and encode it as a quadratic unconstrained binary optimization (QUBO) problem for quantum computers. The steady progress in the field of quantum computing since its proposal in the early 1980s by Richard Feynman has seen researchers trying different directions to circumvent the complexity of constructing portable physical quantum computers. Currently, there are two major approaches for building quantum computers: gate-based and quantum annealing. The gate-based quantum computers are designed using quantum circuits with control and manipulative power over the evolution of quantum states to tackle general problems arising in nature [21, 22]. The quantum annealing approach, however, uses the natural evolution of quantum states to tackle specific problems such as probabilistic sampling and combinatorial optimization problems [22].

Fig 1. Identifying the importance of a node in a network based on (a) degree centrality and (b) eigenvector centrality. The color and radius of each disk around a node depend upon the centrality values. The least central nodes are colored in purple, the mid-central nodes are colored in green, and the most central nodes are colored yellow. Nodes 1, 2 and 3 have the highest values when using the degree centrality measure, while node 0 has the highest value when using the EC measure.
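Eq. (2) is an eigenvalue problem for the leading eigenpair of A, which the usual iterative scheme solves by power iteration. A minimal NumPy sketch of that scheme, on an illustrative adjacency matrix of our own choosing (not one of the paper's graphs):

import numpy as np

# Small illustrative undirected graph with edges (0,1), (1,2), (1,3), (2,3).
A = np.array([[0, 1, 0, 0],
              [1, 0, 1, 1],
              [0, 1, 0, 1],
              [0, 1, 1, 0]], dtype=float)

def eigenvector_centrality(A, tol=1e-12, max_iter=10000):
    """Power iteration on the adjacency matrix; its fixed point satisfies Eq. (2)."""
    n = A.shape[0]
    x = np.ones(n) / np.sqrt(n)        # positive start vector, per the Perron-Frobenius theorem
    for _ in range(max_iter):
        y = A @ x
        y /= np.linalg.norm(y)         # renormalize at every step
        if np.linalg.norm(y - x) < tol:
            x = y
            break
        x = y
    lam1 = x @ A @ x                   # Rayleigh quotient estimate of the leading eigenvalue
    return lam1, x

lam1, x = eigenvector_centrality(A)
print(lam1)                            # leading eigenvalue lambda_1
print(np.argsort(-x))                  # nodes from most to least central; node 1 ranks first here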
The D-Wave quantum computer is a quantum computing platform that uses quantum annealing (QA), a heuristic search method that makes use of quantum tunneling and quantum entanglement to find the ground state of the Ising-model equivalent of a combinatorial optimization problem. That is, any problem to be solved by the D-Wave quantum annealer is to be modeled as a search for the minimum energy of the Ising Hamiltonian energy function

E(s) = Σ_i h_i s_i + Σ_{i<j} J_ij s_i s_j,

where s_i ∈ {−1, 1} are magnetic spin variables subject to local fields h_i and nearest-neighbor interactions with coupling strengths J_ij, or of its Boolean equivalent obtained from the transformation s = 2x − 1, where the entries of the vector x represent the binary variables x_i ∈ {0, 1} and 1 is a vector of ones. The Boolean equivalent of the Ising problem is referred to as a QUBO problem, with the following form:

min_{x ∈ {0,1}^n} x^T Q x,

where Q is a real matrix encoding the problem. D-Wave quantum annealers include the current 2000Q and the new Advantage [23]. The quantum processing unit (QPU) of the D-Wave 2000Q has up to 2048 qubits and 6061 couplers sparsely connected as a Chimera graph C_16 [23], while the Advantage consists of more than 5000 superconducting qubits connected with 35,000 couplers on a Pegasus graph [23]. The sparse connectivity of the Chimera graph of the D-Wave 2000Q requires a "minor embedding" of the Ising model connectivity of the problem onto the hardware. This results in chains of physical qubits representing logical qubits, leading to a maximum capacity of 64 fully connected logical qubits/variables [24]. The D-Wave quantum computers possess the ability to sample degenerate ground state solutions and have been utilized in solving several problems such as quantum isomer search [25], graph partitioning [26], community detection [27], binary clustering [28], graph isomorphism [29] and machine learning [30, 31]. They have also been used in solving physical problems related to atomistic configuration stability [32], job-shop scheduling [33], and airport and air traffic management [34, 35]. The gate-based quantum computers use unitary operations defined on a quantum circuit to transform input data into a desired output [21, 36]. This computational mechanism is employed in the design of IBM quantum computers, which are made available through a cloud-based platform called the IBM Quantum (IBM-Q) Experience [37]. The unitary operations are designed to process data with high fidelity and to tackle both combinatorial optimization problems and non-combinatorial problems like prime factorization [21]. IBM-Q consists of both quantum hardware and simulators. The basic steps for carrying out any experiment on the IBM-Q Experience are first to specify a quantum circuit via a graphical interface called the composer or a text-based editor (the cloud version is called the quantum lab), then to run the circuit on a simulator to verify the specification, and finally to execute the circuit on the quantum processor for a number N of shots, with N = 8192 being the maximum allowed on current devices [36]. In this paper, our main goal is to reformulate the EC problem as a QUBO problem that correctly identifies a desired number of the most important nodes of an undirected graph when solved on quantum annealing and gate-based quantum computers. As a secondary objective, we probe the possibility of using our formulation to generate a ranking of the nodes that corresponds to the rankings obtained via the usual iterative method of solving the EC problem. The paper is organized as follows.
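To make the QUBO/Ising correspondence above concrete, the following sketch builds a tiny QUBO directly as a coefficient dictionary, finds its ground state by exhaustive enumeration with dimod (part of D-Wave's Ocean tools), and converts it to the equivalent Ising form via s = 2x − 1. The three-variable model is our own toy example, not the eigencentrality QUBO developed in this paper.

import dimod

# Toy QUBO: minimize -x0 - x1 - x2 + 2*x0*x1 + 2*x1*x2 over x_i in {0, 1}.
Q = {(0, 0): -1.0, (1, 1): -1.0, (2, 2): -1.0,
     (0, 1): 2.0, (1, 2): 2.0}

bqm = dimod.BinaryQuadraticModel.from_qubo(Q)

# Exhaustive enumeration of all 2^3 assignments; fine for a handful of variables.
sampleset = dimod.ExactSolver().sample(bqm)
print(sampleset.first.sample, sampleset.first.energy)   # ground state {0: 1, 1: 0, 2: 1}, energy -2.0

# The same model in Ising variables s = 2x - 1, the form the annealing hardware minimizes.
h, J, offset = bqm.to_ising()
print(h, J, offset)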
Section 2 elaborates on our extraction of an optimization problem out of the EC problem classically. We then construct a QUBO formulation for identifying a number of the most important nodes of the graph by the EC measure, to be implemented on quantum computing devices. Section 3 presents the software tools, implementations and results obtained from the classical and quantum computers, such as the D-Wave 2000Q and IBM-Q. It also stipulates ways of defining a rank from the results obtained from solving the QUBO on the D-Wave 2000Q and provides discussions of the conclusions derived from the results. In Section 4, we provide a summary of the results and conclude with suggestions of future directions to be considered.

In this section we present the mathematical formulation of the EC problem as an optimization problem. A careful examination of the scheme in Eq. (1) and Eq. (2) shows that the EC is a problem of determining the eigenvector corresponding to the leading eigenvalue of the adjacency matrix of the network. To this end, we recall a useful property of the leading eigenvalue of a symmetric matrix:

λ_1 = max_{x ≠ 0} (x^T A x)/(x^T x) ≤ d_max,

where d_max is the maximum degree of the graph. Hence it is understood that the maximum of the set of numbers {x^T A x / x^T x} coincides with the leading eigenvalue of the adjacency matrix, and thus one can construct a maximization problem from this property as follows:

max_x x^T A x subject to x^T x = 1.

This maximization problem is a constrained optimization problem which is equivalent to the following unconstrained minimization problem:

min_x [ −x^T A x + P (x^T x − 1)^2 ],  (6)

where P is the Lagrange multiplier, or simply a penalty constant, and we have used the fact that 1 = x^T x on the constraint set. The goal here is that the argument of a typical solution to the unconstrained minimization problem should preserve the ranking of the nodes in the network obtained when the usual iterative scheme, Eq. (2), is used in determining the importance of a node. That is, we are more interested in the rank assigned than in the centrality values assigned to each node in the network.

Recall that EC measures the influence of a node in a network. We therefore seek a QUBO problem whose solution classifies the most influential nodes in a network when the EC measure is used. That is to say, the proposed QUBO should directly determine the most influential nodes that would have been identified from the eigenvector in Eq. (1). The word "binary" suggests that our search domain must be a binary field; however, the original problem requires a search space of real vector spaces. Thus, to efficiently transition into binary variables, we will consider simply splitting the set of nodes into two categories, most central and least central, where a value of 1 denotes that a node is most central and a value of 0 denotes that a node is least central. To this end, we define τ as the number of most central nodes we wish to be identified from the set of nodes (i.e., how many nodes are to be assigned the value of 1). The problem of splitting into two categories is binary, as we only have two categories: 0 and 1, or high and low. The value τ therefore must be chosen with consideration of factors such as the size of the network and how many important nodes one wishes to identify. This definition will require a slight modification to our unconstrained minimization model, Eq. (6), and a need for more constraints. Now consider the second term in Eq. (6). Since the search field is now binary, we have x_i^2 = x_i for each i and, based on the definition of τ, Σ_{i=1}^n x_i = τ; we therefore adapt the penalty term to the modified form P (Σ_{i=1}^n x_i − τ)^2.
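A quick numerical check with NumPy of the property above, namely that the maximum of the Rayleigh quotient equals the leading eigenvalue and is bounded by the maximum degree; the graph is an illustrative choice of ours:

import numpy as np

# Adjacency matrix of a small illustrative undirected graph with edges
# (0,1), (0,2), (1,2), (1,3), (2,3).
A = np.array([[0, 1, 1, 0],
              [1, 0, 1, 1],
              [1, 1, 0, 1],
              [0, 1, 1, 0]], dtype=float)

eigvals, eigvecs = np.linalg.eigh(A)
lam1 = eigvals[-1]                       # leading eigenvalue (eigh sorts in ascending order)
v1 = np.abs(eigvecs[:, -1])              # Perron vector, entrywise positive up to a global sign

d_max = A.sum(axis=1).max()              # maximum degree of the graph
print(lam1 <= d_max + 1e-12)             # True: lambda_1 <= d_max

# The Rayleigh quotient x^T A x / x^T x attains lambda_1 at the leading eigenvector
# and is smaller for any other direction, e.g. a random one.
print(np.isclose(v1 @ A @ v1 / (v1 @ v1), lam1))          # True
rng = np.random.default_rng(0)
y = rng.normal(size=A.shape[0])
print(y @ A @ y / (y @ y) <= lam1 + 1e-12)                # True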
We now write the modified penalty term in matrix notation:

P (Σ_{i=1}^n x_i − τ)^2 = P x^T C x + P τ^2,

where we have used the fact that x_i = x_i^2, and C = (1 − 2τ)I + U; I is the n × n identity matrix and U = [u_ij] is an n × n matrix with entries u_ij defined by

u_ij = 1 if i ≠ j, and u_ij = 0 if i = j.  (9)

We now attempt to build a problem Hamiltonian, Eq. (10), from the eigenvector equation, Eq. (2), noting that (Ax)^T (Ax) = x^T A^T A x = x^T A^2 x, where we have used A^T = A for undirected graphs. We wish to obtain the ground state of Eq. (10) and determine whether there is any meaningful information leading to the identification of the most influential node of the graph. Our investigations showed no conclusive information embedded in the ground state for identifying the most central node. However, conclusive information could be drawn from the first excited state. This then suggested a need to modify the above objective function, Eq. (10). Therefore, motivated by the fact that EC is also a measure of walks of infinite length, we proposed a modified objective function whose symmetric matrix Q is defined by Eq. (11). This form was inspired by the form of the quantum search Hamiltonian of Childs and Goldstone's continuous-time quantum walk algorithm described in [38]. In Eq. (11), P_0 and P_1 are penalty constants such that P_1 > P_0, and the vectors e_i ∈ R^n are the canonical basis vectors of R^n. We minimize over all the state vectors of the nodes, x ∈ {0, 1}^n, of the graph G and verify, using experimental results on several graphs, the following proposition:

Proposition 1. Let G = (V, E) be an undirected graph with n = |V| nodes, adjacency matrix A and degree sequence d = (d_1, d_2, . . . , d_n) ∈ R^n. Using the EC measure, the most central node of the graph G is the ground state of the QUBO problem min_{x ∈ {0,1}^n} x^T Q x with Q given by Eq. (11).

The QUBO is solved on the D-Wave 2000Q using D-Wave's Ocean software [45], and also on IBM-Q using the IBM Qiskit QASM simulator or real quantum devices available on IBM-Q, in particular ibmq_manhattan [42]. At the front-end of the D-Wave platform, we use D-Wave Ocean tools to submit instructions for the optimal ground state solution for the problem Hamiltonian/QUBO with specified parameters such as the anneal time, chain strength, post-processing method, and the number of samples to be collected. The front-end then sends the instructions to the 2000Q LANL solver chip for processing. Once the problem Hamiltonian is successfully embedded onto the chip, the annealer solves the QUBO for the minimum energy solution, which is a bit string that minimally violates the constraint. Note that the bit string returned has τ ones, whose corresponding indices denote the top τ influential nodes of the graph. To solve the QUBO problem on the IBM-Q Experience platform, we employed Qiskit's CPLEX tools [46] to generate a quadratic program that is converted to a QUBO/Ising operator for building quantum instances on available QASM simulators or real quantum devices available on IBM-Q. The ground state of the QUBO Hamiltonian is then solved using a Minimum Eigen Solver [47] such as the quantum approximate optimization algorithm (QAOA) [48]. Due to qubit limitations, the QUBO can be implemented only for graphs with at most 65 nodes on IBM-Q's Manhattan, which can encode at most 65 qubits. Our investigation considered fabricated and famous graphs, such as the graph shown in S1 Fig a), the Barbell graph (two complete graphs joined together by a path graph; e.g., the Lollipop graph G_7, see S2 Fig f), and the Tutte graph G_3 (see S2 Fig c), a cubic polyhedral graph with 46 nodes and 69 edges.
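The matrix form of the penalty term can be verified directly: for binary x, x^T C x + τ^2 equals (Σ_i x_i − τ)^2, so it vanishes exactly when τ nodes are selected. A short self-contained check (the helper name is ours):

import itertools
import numpy as np

def penalty_matrix(n, tau):
    """C = (1 - 2*tau) I + U, where U is the all-ones matrix with a zero diagonal (Eq. (9))."""
    U = np.ones((n, n)) - np.eye(n)
    return (1 - 2 * tau) * np.eye(n) + U

n, tau = 4, 2
C = penalty_matrix(n, tau)

# Exhaustive check over all binary vectors of length n.
for bits in itertools.product([0, 1], repeat=n):
    x = np.array(bits)
    assert np.isclose(x @ C @ x + tau ** 2, (x.sum() - tau) ** 2)
print("x^T C x + tau^2 == (sum(x) - tau)^2 for all binary x, n =", n, ", tau =", tau)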
The fabricated graphs were mostly tree graphs (connected acyclic undirected graphs), such as graphs G_1 (see S2 Fig a) and G_8 (see S2 Fig b). The tree graphs G_1 and G_8 were fabricated mainly to test and show that the QUBO formulation is correctly executing EC rather than a degree centrality measure. NetworkX was used to obtain initial EC measures and node rankings for comparison with results from the quantum computations on small graphs. We first analyzed Eq. (6) using classical optimization tools from SciPy [40]. The analysis is aimed at verifying the possibility of constructing an unconstrained optimization problem whose solution determines the EC values of a graph exactly, or within a small margin of error, and concurrently preserves the node rankings of the graphs, before we went on to formulate the quantum version. Classical minimization using this equation preserved the ranking of the nodes within a small error (see Table 2). This is of small concern, however, as these nodes have extremely similar values. Therefore, the results obtained using classical minimization of Eq. (6) were in support of our hypothesis, with the worst case showing the invariance of the rankings in the top 25% of the most central nodes.

Table 1. The first 4 columns describe some basic graph features such as the name of the graph, the number of nodes |V|, the number of edges |E| and the density of the graph ρ.

With these results, we moved forward to the construction of a quantum minimization problem, min_{x ∈ {0,1}^n} x^T Q x, with the objective of identifying the top τ most important nodes within our graphs, as confirmed by the results from NetworkX and the classical minimization mentioned previously. Note that here the goal is to capture the top τ most important nodes of the graph irrespective of their order. Experimenting with the QUBO Eq. (11) on both the D-Wave 2000Q and IBM-Q devices (QASM simulator and ibmq_manhattan for the Karate club graph), we obtained results showing that the formulation indeed captures the EC and not the degree centrality ranking when compared with the NetworkX output. Fig. 2b shows the output for a search for the most central node (colored yellow), which in this case requires τ = 1. Compared with the NetworkX result in Fig. 2a, the quantum computing scheme supports Proposition 1, the claim that the ground state of the QUBO encapsulates the information about the nodes with high-ranked EC values and not those with high-ranked degree centrality values.

Table 2. The result was obtained using the SciPy minimizer with the Nelder-Mead method in solving Eq. (6). The column titled "Rank" defines the rank of each node in the columns titled "Nodes" using the centrality values obtained from both methods.

Fig. 2c shows the result obtained for a search for the top τ = 5 most important nodes (in yellow) of the graph G_8. For the NetworkX results, the most central nodes are identified by the size and brightness of the color of the disk around the nodes; the larger and brighter the disk, the more central the node. For the graph G_8, the yellow node (0) is the most central, followed by nodes 1, 2, 3, and the nodes 4, 5, 6, . . . , 14 are all of the same centrality value and are the least central nodes. It was observed that it is sufficient to choose the penalty constants P_0 = 1/√n and P_1 > P_0 (in our case P_1 = 5n worked well). To obtain optimal results using the D-Wave 2000Q, it is best to set the chain strength to the maximum possible, 1000 in this case.
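The classical baseline just described can be sketched with SciPy's Nelder-Mead minimizer as follows. The graph, penalty constant, starting point, and solver options are illustrative choices of ours (the paper does not list them), and the objective uses Eq. (6) in the form reconstructed earlier.

import numpy as np
import networkx as nx
from scipy.optimize import minimize

# Small illustrative (asymmetric) tree, not one of the paper's graphs G_1, G_8, etc.
G = nx.Graph([(0, 1), (0, 2), (0, 3), (1, 4), (1, 5), (2, 6)])
A = nx.to_numpy_array(G)
n = A.shape[0]
P = 10.0                                   # illustrative penalty constant

def objective(x):
    """Eq. (6) as reconstructed above: -x^T A x + P (x^T x - 1)^2."""
    return -x @ A @ x + P * (x @ x - 1.0) ** 2

x0 = np.full(n, 1.0 / np.sqrt(n))          # uniform positive starting vector
res = minimize(objective, x0, method="Nelder-Mead",
               options={"maxiter": 100000, "maxfev": 100000,
                        "xatol": 1e-10, "fatol": 1e-10})

classical_ranking = list(np.argsort(-np.abs(res.x)))    # ranking from the classical minimizer

ec = nx.eigenvector_centrality_numpy(G)                 # NetworkX reference
reference_ranking = sorted(ec, key=ec.get, reverse=True)

print(classical_ranking)
print(reference_ranking)   # rankings should largely agree; nodes with equal EC may swap places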
It was observed that very low chain strength values resulted in broken chains, affecting the probability of the QA settling into a global minimum solution. Setting the post-processing method to "optimization" and the number of samples to the maximum, 10,000, boosted the chances of obtaining the optimal solution. However, as the graph gets larger, more samples are required to increase the probability of observing the ground state solution. Here is where some inconsistencies in obtaining the lowest energy output for larger graphs were highlighted: the first (or second, etc.) run may not result in the expected output. Running multiple times would eventually result in the correct output occurring once, but due to the noisy and quantum nature of the quantum machine, we do not know the exact number of runs required to give the global minimum solution. This was mostly observed for the Karate Club graph in Fig. 3e and the Davis Southern women network. With a little bit of luck the result can be obtained in the first run or in a few runs. This behavior is not surprising, since an increase in the problem size decreases the probability of finding an optimal solution due to annealing error and imperfect hardware [51]. The quadratic solver QBSolv in Ocean served as a benchmark for determining the correct minimum of the D-Wave annealing output. QBSolv provided the expected results of the minimization on most occasions, with degenerate ground state solutions in some instances. Degeneracy here refers to outputs with the same minimum energy value of the QUBO due to multiple nodes having the same centrality value. Whenever there is degeneracy in the QBSolv output, one of the solutions correctly identifies the top τ nodes. For example, in Fig. 2c, we have 12 degenerate solutions corresponding to the 12 leaves of the tree graph. The solution graphed included node 14 in the top 5 important nodes; however, any of the nodes 4, 5, 6, 7, 8, 9, 10, 11, 12, 13 or 15 is a valid replacement for node 14 in this output, since column 3 of Table 2 shows that all these nodes have the same centrality values. Solving the QUBO in Eq. (11) for the graph in Fig. 1 and graph G_8 in S2 Fig correctly identifies the most important nodes for any given τ, including selecting node 0 as the node with the highest EC (see Fig. 2). The penalty weights used here were P_0 = 1/√n and P_1 = 5n, where n is the number of nodes of the graph. The same results were obtained from experiments on IBM-Q's QASM simulator using QAOA. However, the ground state solution for the graphs with smaller numbers of nodes (n ≤ 16) was obtained in a single run with the penalty constants P_0 = 1/√n and P_1 = 5n. When considering graphs with larger numbers of nodes, the program had to be run multiple times using the same penalty constants to be able to capture the global minimum solution. Solving for the ground state solution of the QUBO in Eq. (11) for the graph G_1 and the Karate Club graph in S2 Fig was quite challenging. On most occasions, the solution for the Karate Club graph with τ ≥ 3 required multiple runs before settling on the global minimum solution; in other words, this graph required more samples to be able to output the global minimum. For 3 ≤ τ ≤ 5, the QUBO could not capture the ordering that matched the NetworkX rankings when using the penalty constants P_0 = 1/√n and P_1 = 5n for the graph G_1. The nodes 0, 3, 4 were always skipped in the search for the top τ = 3, 4, 5.
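For reference, a D-Wave submission along the lines discussed above might look like the following sketch using D-Wave's Ocean SDK. The sampler composition and the parameter values (num_reads, chain_strength) mirror the settings mentioned in the text, but the code is our illustration of the workflow, not the authors' actual scripts; Q is assumed to be the symmetric QUBO matrix of Eq. (11), built elsewhere.

from dwave.system import DWaveSampler, EmbeddingComposite

def solve_qubo_on_dwave(Q, num_reads=10000, chain_strength=1000):
    """Submit a dense symmetric QUBO matrix Q (numpy array) to a D-Wave QPU."""
    n = Q.shape[0]
    # Fold the symmetric matrix into the upper-triangular dictionary form expected by Ocean.
    Q_dict = {}
    for i in range(n):
        if Q[i, i] != 0:
            Q_dict[(i, i)] = Q[i, i]
        for j in range(i + 1, n):
            if Q[i, j] != 0 or Q[j, i] != 0:
                Q_dict[(i, j)] = Q[i, j] + Q[j, i]

    sampler = EmbeddingComposite(DWaveSampler())     # handles the minor embedding onto the QPU
    sampleset = sampler.sample_qubo(Q_dict,
                                    num_reads=num_reads,
                                    chain_strength=chain_strength)
    best = sampleset.first                           # lowest-energy sample found
    top_nodes = [i for i, bit in best.sample.items() if bit == 1]
    return top_nodes, best.energy

# Example call (Q built from Eq. (11) for a given graph and tau):
# top_nodes, energy = solve_qubo_on_dwave(Q)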
From Fig. 3d, we see that the output for the top 6 most important nodes excludes node 4 and includes node 2, which should not be the case since, from Fig. 3a, node 4 is more central than node 2. The exact reason for this occurrence on this particular graph is unknown; however, it seems the program picks only one of the degenerate nodes 3 or 4 and moves on to select the next central node, 2. With these interesting results, we further examined the possibility of defining a hierarchy of nodes in the network from the QUBO results obtained from the D-Wave 2000Q and IBM-Q. By hierarchy we mean a ranking that orders the nodes based on importance or influence using EC.

Table 3. Determining node rank from the QUBO result obtained from D-Wave (QA) and IBM-Q (QAOA) using the symmetric difference of results for τ = 1 to τ = n. In column 4, the ith node in rank is determined by taking the difference of the result for τ = i and τ = i − 1; e.g., the 1st (most central) node, 2, is determined by implementing the QUBO for τ = 1, and the 2nd is determined by taking the difference between the results for τ = 2 and τ = 1.

We have formulated a QUBO problem and shown that the identification of the most important nodes, or hot-spots, in a graph based on the EC measure is possible. Using quantum computing algorithms such as quantum annealing on the D-Wave 2000Q and QAOA on IBM-Q, our formulation was able to correctly identify all τ < n most important nodes for graphs with fewer than 17 nodes (n < 17). For graphs with more than 16 nodes, the quantum computing algorithm always correctly identified the top τ ≤ 6 most important nodes for all the graphs considered, except for the Davis Southern women network and the tree graph G_1, whose outputs for some values of 1 < τ ≤ 6 showed some marginal inconsistencies. These are marginal because the differences are negligible; for example, for the graph G_1, the node 3 was not selected since its centrality value is the same as that of node 4. Despite this challenge, the results obtained from all graphs considered for τ = 1 verified Proposition 1. We have also demonstrated the feasibility of defining a hierarchy of nodes in a graph from our formulation using the QUBO results from the D-Wave 2000Q and IBM-Q's QASM simulator. Given that the current quantum resources at our disposal (D-Wave 2000Q and IBM-Q's Manhattan) limit the size of graphs we can explore, and given the presence of uncontrollable noise which affects the probability of obtaining quality results, we were unable to experiment with real-life data. Therefore, in the future, when more powerful and less noisy quantum computers are available to us, we wish to test our hypothesis further to establish a more generalized formulation that works for all τ on all graphs. That is, we want to verify:

Claim 1. For any graph G = (V, E) with adjacency matrix A and degree sequence D, the indices of the nonzero elements of the ground state solution to the QUBO problem, where the matrix U is defined in Eq. (9), correspond to the τ most central nodes of the graph G via the EC measure, for any τ ≤ n = |V|.

For this claim we wish to investigate both directed and undirected graphs: will replacing A^2 by AA^T or A^T A in Q still work for directed graphs, or will it require a modified QUBO?

Supporting information
S1 Appendix. Computing Degree Centrality and Eigencentrality from the Exponential Function.
Consider the exponential function defined by

f(x) = e^x = Σ_{k=0}^∞ x^k / k!.

Since f is continuous and analytic with an infinite radius of convergence, we observe that for a primitive matrix A, which is mostly the case for the adjacency matrix of simple, connected, undirected graphs,

f(γA) = Σ_{k=0}^∞ γ^k A^k / k!.

Then f(γA) is a matrix, and the components of the vector f(γA)1, where 1 is a vector of ones, count the walks of infinite length centered at each node. Let {e_i ∈ R^n} be the set of canonical basis vectors for R^n, whose only nonzero element is the ith component. Then

e_i^T f(γA) 1 = 1 + γ d_i + (γ^2/2!) (A^2 1)_i + . . . ,  (15)

where d_i is the degree of the ith node. In the limit γ → 0+, Eq. (15) converges to the degree centrality and, by expanding in the eigenbasis, it converges to EC for γ → ∞ [10].

References

Centrality in social networks conceptual clarification. Social Networks.
The centrality index of a graph.
How accurate and statistically robust are catalytic site predictions based on closeness centrality?
A Set of Measures of Centrality Based on Betweenness.
The importance of bottlenecks in protein networks: correlation with gene essentiality and expression dynamics.
A New Status Index Derived from Sociometric Index.
Ranking candidate disease genes from gene expression and protein interaction: a Katz-centrality based approach.
The anatomy of a large-scale hypertextual Web search engine. Computer Networks and ISDN Systems.
Total communicability as a centrality measure.
On the Limiting Behavior of Parameter-Dependent Network Centrality Measures.
Correlations among centrality indices and a class of uniquely ranked graphs.
Analyzing complex networks through correlations in centrality measurements.
On the Eigenvalue Power Law.
Power and Centrality: A Family of Measures.
Eigenvector centrality for characterization of protein allosteric pathways.
Eigenvector centrality for geometric and topological characterization of porous media.
Tracking the spread of COVID-19 in India via social networks in the early phase of the pandemic.
In and out lockdowns: Identifying the centrality of economic activities.
Quantum computing in a nutshell.
Welcome to D-Wave.
Minor-Embedding a Problem onto the QPU.
Quantum isomer search.
Graph Partitioning Using Quantum Annealing on the D-Wave System.
Detecting multiple communities using quantum annealing on the D-Wave system.
Ising Models for Binary Clustering via Adiabatic Quantum Computing.
QUBO formulations for the graph isomorphism problem and related problems.
Quantum adiabatic machine learning. Quantum Inf Process.
Application of Quantum Annealing to Training of Deep Neural Networks.
Vacancies in graphene: an application of adiabatic quantum optimization.
Quantum Annealing Implementation of Job-Shop Scheduling.
Flight Gate Assignment with a Quantum Annealer.
Quantum Annealing Applied to De-Conflicting Optimal Trajectories for Air Traffic Management.
Benchmarking gate-based quantum computers.
Quantum Experience is quantum on the cloud.
Spatial search by quantum walk.
D-Wave's Ocean Software.
Exploring Network Structure, Dynamics, and Function using NetworkX.
A 2D graphics environment.
Los Alamos National Laboratory Upgrades to D-Wave.
Converters for Quadratic Programs.
Minimum Eigen Optimizer.
Graph generators.
Barabási-Albert model - Wikipedia, The Free Encyclopedia.
Quantum annealing correction for random Ising problems.

The hierarchy can then be used to identify, for example, super-spreaders of disease. We compare the hierarchy of nodes obtained using our QUBO formulation with that obtained from NetworkX when using the EC algorithm. The result obtained for graph M is shown in Table 3.
To determine the node rank, we consider the set of τ nodes obtained from the QUBO results for each value of 0 < τ ≤ n and compute the symmetric difference. The ith rank is determined by taking the difference of the QUBO result for τ = i and τ = i − 1. For example, to rank the nodes for graph M, the 1st most important node is obtained by running the QUBO for τ = 1. The 2nd most important node is determined by taking the difference between the QUBO results for τ = 2 and τ = 1, and so on.
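A sketch of this ranking procedure in Python, assuming a helper solve_qubo_top_tau(G, tau) that returns the set of nodes flagged as most central for a given τ (the helper name is hypothetical; any of the solvers discussed above could back it):

def rank_nodes_by_difference(solve_qubo_top_tau, G, n):
    """Build a full node ranking from the QUBO solutions for tau = 1, ..., n.

    The i-th ranked node is taken from the difference between the solution for
    tau = i and the solution for tau = i - 1, following the procedure above.
    """
    ranking = []
    previous = set()
    for tau in range(1, n + 1):
        current = set(solve_qubo_top_tau(G, tau))   # nodes selected as most central for this tau
        new_nodes = current - previous
        # If the tau-solutions are nested, new_nodes contains exactly one node and the set
        # difference coincides with the symmetric difference. Degenerate centrality values
        # can yield more than one candidate; any consistent tie-break (here, sorting) keeps
        # the ranking well defined.
        ranking.extend(sorted(new_nodes))
        previous = current
    return ranking

# Example (hypothetical backend):
# ranking = rank_nodes_by_difference(solve_qubo_top_tau, G, G.number_of_nodes())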