Why was the term “discrete” used in discrete logarithm?


Is there anything especially "discrete" about a discrete logarithm? This is not a question about what a discrete logarithm is, or about why the discrete logarithm problem is an "intractable problem" under certain circumstances. I'm just trying to determine whether there is some additional meaning to the term "discrete" as it is used in the name "discrete logarithm".



The definition of "discrete" is "individually separate and distinct". Could it be that the term "discrete" is a reference to the least non-negative residues of a modulus, or to the order of points of a particular cyclic group on an elliptic curve?

discrete-logarithm terminology

asked Apr 15 at 20:09 by JohnGalt
  • Traditional logarithm: answer is a real or complex number. Discrete logarithm: answer is an element of a finite set $\mathbb{Z}_n$. – Mikero, Apr 15 at 20:18

  • See also discrete mathematics. – BlueRaja - Danny Pflughoeft, Apr 15 at 22:48

  • That's pretty discreet information. – yo', Apr 17 at 10:26
3 Answers

The word discrete is used as an antonym of 'continuous'; that is, it is the normal logarithm problem, just over a discrete group.



The standard logarithm problem is over the infinite group $\mathbb{R}^*$; this group is called 'continuous' because, for any element $x$, there are other elements arbitrarily close to it.



The discrete logarithm problem is over a finite group (for example, $\mathbb{Z}_p^*$); in contrast to $\mathbb{R}^*$, we don't have group elements arbitrarily close together; we call this type of group 'discrete'.

answered Apr 15 at 20:18 by poncho
  • Yes, being discrete is not the "core reason" why the DLP can be hard - although note that if we are ever to use the crypto we build on a computer, things had better be discrete - at best, we can only approximate a continuous primitive in a discrete way. – Geoffroy Couteau, Apr 15 at 20:51

  • @JohnGalt The hard part of the DLP is the modular reduction "hiding" how many times you've "wrapped around". Without modular reduction, you can do some sort of "binary search" to get accurate lower/upper bounds on the discrete log rather efficiently. – Mark, Apr 15 at 20:51

  • @GeoffroyCouteau This isn't precisely true. Any computable number has a finite-length Turing machine that, on input $n$, will output the $n$th digit of the number. This can be viewed as a finite-length representation of the number. As an example, there are formulas for $\pi$ (such as the BBP formula) that compute the $i$th digit without computing all smaller digits. This is definitely a different notion of "representation" than simply storing the literal value in memory, but the computable numbers are notably not discrete (they contain $\mathbb{Q}$ as a subfield). – Mark, Apr 15 at 21:04

  • @Mark You are perfectly right, although for the purpose of getting a high-level intuition for this specific question (regarding discrete logs), I felt my answer would provide one. We could perhaps work with problems over more general structures through appropriate representations, though I believe this has never been done in cryptography. – Geoffroy Couteau, Apr 15 at 21:08

  • @JohnGalt: To add a bit to Mark and GeoffroyCouteau's comments: in principle, if we had a computer that could store arbitrary real numbers and perform infinite-precision math on them, then we could do the equivalent of modular reduction by (for example) just dropping the integer part and taking the fractional part. (If all you knew was that n is an integer in [0, N) and the fractional part of eⁿ was 0.1254123452312..., it would be very hard to find n.) So you can see that the discreteness really isn't what makes the discrete logarithm hard to compute. – ruakh, Apr 16 at 0:01
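To illustrate Mark's point about modular reduction hiding the "wrap-around", here is a small Python sketch (my own illustration, not part of the thread). The function names and search strategy are just for the demo: without a modulus, the monotone growth of $g^x$ lets an exponential-plus-binary search recover $x$ in roughly $\log_2 x$ steps, while with a modulus this sketch can only fall back to exhaustive search (real attacks use smarter generic algorithms such as baby-step/giant-step, but those are still exponential in the bit length of the group order).

    def log_no_reduction(g, y):
        """Recover the integer x with g**x == y when NO modular reduction is
        applied: g**x is monotone in x, so bounds on y translate into bounds on x."""
        hi = 1
        while g**hi < y:          # exponential search for an upper bound on x
            hi *= 2
        lo = 0
        while lo < hi:            # ordinary binary search, ~log2(x) comparisons
            mid = (lo + hi) // 2
            if g**mid < y:
                lo = mid + 1
            else:
                hi = mid
        return lo

    def discrete_log_bruteforce(g, y, p):
        """With reduction mod p there is no usable notion of 'close', so this
        sketch simply tries every exponent in turn."""
        acc = 1
        for x in range(p - 1):
            if acc == y:
                return x
            acc = acc * g % p
        return None               # y is not a power of g modulo p

    print(log_no_reduction(3, 3**1000))              # 1000, found in a few dozen steps
    print(discrete_log_bruteforce(19, 7155, 34591))  # toy parameters from John Coleman's answer below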


















While I agree completely with poncho's answer, this other viewpoint might be useful.
Specifically, I think a better comparison isn't between $\mathbb{Z}_p^*$ and $\mathbb{R}^*$, but between $\mathbb{Z}_p^*$ and $S^1$. We can view $S^1 \cong \{z \in \mathbb{C} \mid |z| = 1\}$. It's not hard to show that any $z \in S^1$ can be written as $z = \exp(2\pi i t)$ for $t \in \mathbb{R}$ (we don't strictly need the factor $2\pi$ here, but it's traditional). Because $t \mapsto \exp(2\pi i t)$ is periodic, it's in fact enough to have $t \in [0,1)$.

This has an obvious group structure, in that:
$$\exp(2\pi i t_0)\exp(2\pi i t_1) = \exp(2\pi i (t_0+t_1)).$$
If we make the restriction that $t_i \in [0,1)$, then we have to take $t_0+t_1 \bmod 1$, but this is fairly standard.

More than just having an obvious group structure, we actually have that any $\mathbb{Z}_p^*$ injects into it.
Specifically, we always have:
$$
\phi_p\colon \mathbb{Z}_p^* \to S^1, \quad \phi_p(x) = \exp(2\pi i x/(p-1)).
$$
Here, the $p-1$ in the denominator is because $|\mathbb{Z}_p^*| = p-1$.
We can define the discrete logarithm problem for both of these groups in the standard way (here, it's important to restrict $t_i \in [0, 1)$ if we want a unique answer).
Then, we can relate these problems to each other via the aforementioned injection.
Through this image, we see that $S^1$ is "continuous" in the sense that it takes up the full circle, but the image of $\mathbb{Z}_p^*$ in $S^1$ will always be "discrete" --- there will always be "some space" between points (they can't get arbitrarily close).

answered Apr 15 at 20:49 by Mark
  • A nice feature of this example is that because S1 can readily support an operator that raises values to a fractional power, one can define log functions in terms of that. Because for most pairs of (x,z) there would generally be many exponent values such that x to the power y would yield z, there are many possible functions which use different ways to select which power y gets used. A continuous-log function might select a y in S1, while a discrete-log function would select the smallest y in Z. – supercat, Apr 16 at 15:32
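To make the "discrete image inside a continuous circle" picture concrete, here is a short numerical sketch (my addition, not part of Mark's answer); the prime $p = 101$ is an arbitrary toy choice.

    import cmath, math

    # Embed the elements x = 1, ..., p-1 of Z_p^* into the unit circle S^1 via
    # phi_p(x) = exp(2*pi*i*x/(p-1)) and check that the image points are evenly
    # spaced, i.e. genuinely "discrete".
    p = 101
    image = [cmath.exp(2j * math.pi * x / (p - 1)) for x in range(1, p)]

    # Neighbouring image points sit a fixed chord length 2*sin(pi/(p-1)) apart,
    # so no two points of the image are arbitrarily close together.
    gaps = [abs(image[k] - image[k - 1]) for k in range(1, len(image))]
    print(min(gaps), max(gaps), 2 * math.sin(math.pi / (p - 1)))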




















Just to add to the other answers: (as mentioned in some of the comments) it is exactly the discreteness of the discrete log problem that makes it (for some parameter choices) hard. Computing $y = \log_a(x)$ is the same as solving the equation $a^y = x$ for $y$. In the non-discrete case, $y \mapsto a^y$ is a monotonically increasing (if $a > 1$) continuous function. Thus, you can (in the absence of even more efficient methods) use the bisection method to solve for $y$. When you have a value $y$ for which $a^y$ is close to the target $x$, you know that $y$ is close to the value you seek. Knowing when you are close to a solution is very useful information.



In the discrete case, there is no corresponding notion of closeness. Say that for some reason you wanted to compute the base-$19$ discrete log of $7155 \pmod{34591}$ and somehow found that $19^{481} \equiv 7156 \pmod{34591}$. Does this imply that $\log_{19}(7155)$ is close to $481$? Not at all. The actual value is $\log_{19}(7155) = 28544$. It is much harder to find a solution when you can't tell when you are close.

answered Apr 17 at 11:25 (edited Apr 17 at 11:32) by John Coleman
  • Is "knowing when you are close to a solution is very useful information" related to knowing when the value of something is greater than the value of another? That is, in a binary search algorithm used to calculate a log from an exponentiated power, it is critical to know the upper and lower limits of the target power on each test. Each iteration gets you closer and closer to the solution until the solution is reached. If you can't determine whether the test is greater than (or less than) the target, the search doesn't work. – JohnGalt, Apr 17 at 15:34

  • @JohnGalt They are closely related (because the standard topology on $\mathbb{R}$ is the order topology), but they are not the same thing. Order definitely plays a role here, but continuity can be used to solve equations even when there isn't a clear order ($\mathbb{C}$ does not have an order topology). Certainly the bisection method for finding a real root of a continuous function has the same basic logic as a binary search. – John Coleman, Apr 17 at 16:21
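A short Python sketch (my addition, not part of the answer) of the contrast John Coleman describes, reusing the answer's toy parameters $19$, $7155$ and $34591$; the claimed discrete log $28544$ is taken from the answer, not independently derived.

    import math

    def bisect_log(a, x, tol=1e-12):
        """Continuous case: solve a**y == x for real y by bisection.  Because
        y -> a**y is increasing, a**y being close to x means y is close to the
        answer, which is exactly what the bisection method exploits."""
        lo, hi = 0.0, 1.0
        while a**hi < x:              # grow the bracket until it contains the answer
            hi *= 2
        while hi - lo > tol:
            mid = (lo + hi) / 2
            if a**mid < x:
                lo = mid
            else:
                hi = mid
        return lo

    print(bisect_log(19, 7155), math.log(7155, 19))  # both about 3.014

    # Discrete case: outputs that are numerically adjacent tell you nothing about
    # the exponents.  As the answer notes, 19**481 mod 34591 lands right next to
    # 7155, yet 481 is nowhere near the claimed discrete log 28544.
    p = 34591
    print(pow(19, 481, p), pow(19, 28544, p))        # the answer claims 7156 and 7155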












