# The origins of exponential random graph models

*Posted on 14 September 2014 by Luis Ospina Forero.*

The article *An Exponential Family of Probability Distributions for Directed Graphs* by Holland and Leinhardt (1981) laid the foundation for what are now known as exponential random graph models (ERGMs), or $p^*$ models, which model the whole adjacency matrix (or graph) $X$ jointly. In that article the authors proposed an exponential family of probability distributions for $P(X = x)$, where $x$ is a possible realisation of the random matrix $X$.

The article focuses mainly on directed graphs, although the theory can be extended to undirected graphs.
Two main *effects* or *patterns* are considered in the article:

**Reciprocity**, which relates to the appearance of symmetric interactions ($X_{ij} = 1 \iff X_{ji} = 1$; see nodes 3–5 in the figure below).

![Stochastic_block_model_directed](https://i0.wp.com/www.blopig.com/blog/wp-content/uploads/2014/09/Stochastic_block_model_directed.jpg)

**Differential attractiveness** of each node, which relates to the number of interactions a node "receives" (its in-degree) and the number of interactions it "produces" (its out-degree). The figure below illustrates the differential attractiveness of two groups of nodes.

![Stochastic_block_model_directed2](https://i0.wp.com/www.blopig.com/blog/wp-content/uploads/2014/09/Stochastic_block_model_directed2.jpg)

The model of Holland and Leinhardt (1981), called the $p_1$ model, considers jointly the reciprocity of the graph and the differential attractiveness of each node:

$$p_1(x) = P(X = x) \propto e^{\rho m + \theta x_{**} + \sum_i \alpha_i x_{i*} + \sum_j \beta_j x_{*j}},$$

where $\rho, \theta, \alpha_i, \beta_j$ are parameters, with the identifying constraints $\alpha_* = \beta_* = 0$ (the $\alpha_i$ and the $\beta_j$ each sum to zero). The parameters can be interpreted as follows: $\rho$ is the mean tendency of **reciprocation**; $\theta$ controls the **density** (overall number of edges) of the network; $\alpha_i$ is the **productivity** (out-degree effect) of node $i$; and $\beta_j$ is the **attractiveness** (in-degree effect) of node $j$.

The statistics $m$, $x_{**}$, $x_{i*}$ and $x_{*j}$ are, respectively: the number of mutual (reciprocated) dyads in the observed graph, the number of edges, the out-degree of node $i$, and the in-degree of node $j$.

Taking $D_{ij} = (X_{ij}, X_{ji})$, the model assumes that all $D_{ij}$ with $i < j$ are independent.

---

To better understand the model, let us review its derivation. Consider the pairs $D_{ij} = (X_{ij}, X_{ji})$, $i < j$, and describe the joint distribution of $\{D_{ij}\}_{i<j}$, assuming all $D_{ij}$ are statistically independent.
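To make the ingredients of $p_1$ concrete, the sufficient statistics and the unnormalised log-density can be computed directly from a 0/1 adjacency matrix. A minimal sketch, assuming numpy (the function names are mine, not Holland and Leinhardt's):

```python
import numpy as np

def p1_statistics(x):
    """Sufficient statistics of the p1 model for a 0/1 adjacency matrix x.

    Returns m (number of mutual dyads), the edge count x_**,
    the out-degrees x_i* and the in-degrees x_*j."""
    x = np.asarray(x).copy()
    np.fill_diagonal(x, 0)                 # no self-loops
    m = np.sum(np.triu(x * x.T, k=1))      # mutual dyads, each counted once
    edges = x.sum()                        # x_**
    out_deg = x.sum(axis=1)                # x_i*
    in_deg = x.sum(axis=0)                 # x_*j
    return m, edges, out_deg, in_deg

def p1_log_density_unnormalised(x, rho, theta, alpha, beta):
    """rho*m + theta*x_** + sum_i alpha_i x_i* + sum_j beta_j x_*j."""
    m, edges, out_deg, in_deg = p1_statistics(x)
    return rho * m + theta * edges + alpha @ out_deg + beta @ in_deg
```

For a three-node graph with arcs $0 \to 1$, $1 \to 0$ and $1 \to 2$, this gives $m = 1$ (the mutual dyad between nodes 0 and 1) and $x_{**} = 3$.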
This can be done by parameterising the probabilities

$$P(D_{ij} = (1,1)) = m_{ij} \ \text{ for } i < j,$$

$$P(D_{ij} = (1,0)) = a_{ij} \ \text{ for } i \neq j,$$

$$P(D_{ij} = (0,0)) = n_{ij} \ \text{ for } i < j,$$

where $m_{ij} + a_{ij} + a_{ji} + n_{ij} = 1$ for all $i < j$. This leads to

$$P(X = x) = \prod_{i<j} m_{ij}^{x_{ij} x_{ji}} \prod_{i \neq j} a_{ij}^{x_{ij}(1 - x_{ji})} \prod_{i<j} n_{ij}^{(1 - x_{ij})(1 - x_{ji})} = e^{\sum_{i<j} x_{ij} x_{ji} \rho_{ij} + \sum_{i \neq j} x_{ij} \theta_{ij}} \prod_{i<j} n_{ij},$$

where $\rho_{ij} = \log(m_{ij} n_{ij} / a_{ij} a_{ji})$ for $i < j$, and $\theta_{ij} = \log(a_{ij} / n_{ij})$ for $i \neq j$.

The parameters $\rho_{ij}$ and $\theta_{ij}$ can be interpreted as the reciprocity and the differential attractiveness, respectively.
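The equality of the product form and the exponential form can be checked numerically for arbitrary dyad probabilities and an arbitrary graph. A quick sketch, assuming numpy (variable names mine):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 4

# Random dyad probabilities: for each i < j, (m_ij, a_ij, a_ji, n_ij) sums to 1.
m_ = np.zeros((n, n)); a_ = np.zeros((n, n)); n_ = np.zeros((n, n))
for i in range(n):
    for j in range(i + 1, n):
        m_[i, j], a_[i, j], a_[j, i], n_[i, j] = rng.dirichlet(np.ones(4))

# An arbitrary 0/1 directed graph with no self-loops.
x = (rng.random((n, n)) < 0.5).astype(int)
np.fill_diagonal(x, 0)

# Product form of P(X = x).
prod = 1.0
for i in range(n):
    for j in range(i + 1, n):
        prod *= (m_[i, j] ** (x[i, j] * x[j, i])
                 * a_[i, j] ** (x[i, j] * (1 - x[j, i]))
                 * a_[j, i] ** (x[j, i] * (1 - x[i, j]))
                 * n_[i, j] ** ((1 - x[i, j]) * (1 - x[j, i])))

# Exponential form with rho_ij = log(m n / (a_ij a_ji)), theta_ij = log(a_ij / n).
exponent = 0.0
for i in range(n):
    for j in range(i + 1, n):
        rho_ij = np.log(m_[i, j] * n_[i, j] / (a_[i, j] * a_[j, i]))
        exponent += x[i, j] * x[j, i] * rho_ij
        exponent += x[i, j] * np.log(a_[i, j] / n_[i, j])   # theta_ij
        exponent += x[j, i] * np.log(a_[j, i] / n_[i, j])   # theta_ji
expo = np.exp(exponent) * np.prod([n_[i, j] for i in range(n) for j in range(i + 1, n)])

assert np.isclose(prod, expo)
```

The final assertion passes for any choice of dyad probabilities, since the two forms are algebraically identical.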
With a bit of algebra we get

$$e^{\rho_{ij}} = \frac{P(X_{ij}=1 \mid X_{ji}=1) \,/\, P(X_{ij}=0 \mid X_{ji}=1)}{P(X_{ij}=1 \mid X_{ji}=0) \,/\, P(X_{ij}=0 \mid X_{ji}=0)},$$

and

$$e^{\theta_{ij}} = \frac{P(X_{ij}=1 \mid X_{ji}=0)}{P(X_{ij}=0 \mid X_{ji}=0)};$$

that is, $\rho_{ij}$ is the log odds ratio measuring how much more likely the tie $i \to j$ becomes when the reciprocal tie $j \to i$ is present, and $\theta_{ij}$ is the log odds of the tie $i \to j$ in the absence of the reciprocal tie.

Now consider the following restrictions: $\rho_{ij} = \rho$ for all $i < j$, and $\theta_{ij} = \theta + \alpha_i + \beta_j$ for all $i \neq j$, where $\alpha_* = \beta_* = 0$.

With some algebra we recover the proposed form of the model:

$$p_1(x) = P(X = x) \propto e^{\rho m + \theta x_{**} + \sum_i \alpha_i x_{i*} + \sum_j \beta_j x_{*j}}.$$
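Because the dyads are independent, simulating from $p_1$ is straightforward: each dyad $D_{ij}$, $i < j$, takes one of the four states $(0,0), (1,0), (0,1), (1,1)$ with probability proportional to its contribution to the exponent. A minimal sketch, assuming numpy (function name mine):

```python
import numpy as np

def simulate_p1(rho, theta, alpha, beta, rng):
    """Simulate one directed graph from the p1 model by sampling each dyad
    D_ij = (X_ij, X_ji), i < j, independently from its four-point distribution."""
    n = len(alpha)
    x = np.zeros((n, n), dtype=int)
    for i in range(n):
        for j in range(i + 1, n):
            # Unnormalised weights of the states (0,0), (1,0), (0,1), (1,1),
            # read off from the exponent of p1.
            w = np.array([
                1.0,
                np.exp(theta + alpha[i] + beta[j]),
                np.exp(theta + alpha[j] + beta[i]),
                np.exp(rho + 2 * theta + alpha[i] + beta[j] + alpha[j] + beta[i]),
            ])
            d = rng.choice(4, p=w / w.sum())
            x[i, j] = d in (1, 3)
            x[j, i] = d in (2, 3)
    return x

rng = np.random.default_rng(1)
n = 30
g = simulate_p1(rho=2.0, theta=-1.0, alpha=np.zeros(n), beta=np.zeros(n), rng=rng)
```

With a large positive $\rho$ most edges come in mutual pairs, while a large negative $\rho$ makes mutual dyads rare, which matches the interpretation of $\rho$ as the mean tendency of reciprocation.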