In Linear Algebra, what is a vector?
I understand that a vector space is a collection of vectors that can be added and scalar-multiplied while satisfying the eight axioms; however, I do not know what a vector itself is.
I know that in physics a vector is a geometric object that has a magnitude and a direction, and that in computer science a vector is a container that holds elements and can expand or shrink, but in linear algebra the definition of a vector isn't so clear.
So, what is a vector in Linear Algebra?
linear-algebra vector-spaces
Comments:

A vector is simply an element of a vector space. – Mariano Suárez-Álvarez, Sep 22 '16 at 18:59

Do not think too much about what a vector or a vector space is. It is just an algebraic structure that happens to have extremely interesting properties, which are then applied everywhere in science. – user305860, Sep 22 '16 at 19:00

"A vector space is a collection of vectors that..." You should avoid using the word "vectors" there, and simply say, "A vector space is a set $S$ together with a field $F$ and functions $+$ and $\times$ such that the following axioms are satisfied..." Then you have avoided using the word "vector", so you don't need to define it. If you'd like, you can refer to the elements of $S$ as "vectors" to remind people that $(S,F,+,\times)$ is a vector space. – littleO, Sep 22 '16 at 23:36

The reason is that physicists care about the definition of a vector. In linear algebra, the definition of a vector is irrelevant; what is important is the definition of a vector space. – Asaf Karagila♦, Sep 23 '16 at 4:53

@AsafKaragila: Sort of. Actually, in modern physics they care that a quantity transforms like a vector. So ultimately it's still defined by behaviour, just that it takes more behaviour than the vector space axioms alone. – celtschk, Sep 23 '16 at 6:27
asked Sep 22 '16 at 18:57 by Paul Lee; edited Sep 23 '16 at 18:09 by Aaron Hall
9 Answers
In modern mathematics, there's a tendency to define things in terms of what they do rather than in terms of what they are.
As an example, suppose that I claim that there are objects called "pizkwats" that obey the following laws:
- $\forall x.\ \forall y.\ \exists z.\ x + y = z$
- $\exists x.\ x = 0$
- $\forall x.\ x + 0 = 0 + x = x$
- $\forall x.\ \forall y.\ \forall z.\ (x + y) + z = x + (y + z)$
- $\forall x.\ x + x = 0$
These rules specify what pizkwats do by saying what rules they obey, but they don't say anything about what pizkwats are. We can find all sorts of things that we could call pizkwats. For example, we could imagine that pizkwats are the numbers 0 and 1, with addition being done modulo 2. They could also be bitstrings of length 137, with "addition" meaning "bitwise XOR." Both of these groups of objects obey the rules for what pizkwats do, but neither of them "are" pizkwats.
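The two models just mentioned can be checked mechanically. Below is a small sketch (not from the original answer) that verifies the mod-2 model against each pizkwat law:

```python
from itertools import product

# A candidate "pizkwat" model: the numbers 0 and 1 with addition modulo 2.
elements = [0, 1]

def add(x, y):
    return (x + y) % 2

# Closure: x + y is always an element.
assert all(add(x, y) in elements for x, y in product(elements, repeat=2))
# Identity: x + 0 == 0 + x == x.
assert all(add(x, 0) == add(0, x) == x for x in elements)
# Associativity: (x + y) + z == x + (y + z).
assert all(add(add(x, y), z) == add(x, add(y, z))
           for x, y, z in product(elements, repeat=3))
# Self-inverse: x + x == 0.
assert all(add(x, x) == 0 for x in elements)
# A derived property: commutativity holds as well.
assert all(add(x, y) == add(y, x) for x, y in product(elements, repeat=2))
```

The bitstring-with-XOR model passes the same checks for the same reason: XOR is closed, associative, has 0 as identity, and every bitstring is its own inverse.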
The advantage of this approach is that we can prove results about pizkwats knowing purely how they behave rather than what they fundamentally are. For example, as a fun exercise, see if you can use the above rules to prove that
$\forall x.\ \forall y.\ x + y = y + x$.
This means that anything that "acts like a pizkwat" must support a commutative addition operator. Similarly, we could prove that
$\forall x.\ \forall y.\ (x + y = 0 \rightarrow x = y)$.
The advantage of setting things up this way is that any time we find something that "looks like a pizkwat," in the sense that it obeys the rules given above, we're guaranteed that it must have some other properties - namely, that its addition is commutative and that every element is its own unique inverse. We could develop a whole elaborate theory about how pizkwats behave purely from the rules of how they work, and since we never actually said what a pizkwat is, anything we find that looks like a pizkwat instantly falls into our theory.
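For readers who want to check their attempt at the first exercise, here is one possible derivation (a spoiler, added by the editor; it uses only the identity, associativity, and self-inverse laws):

```latex
x + y = 0 + (x + y) + 0
      = (y + y) + (x + y) + (x + x)
      = y + \bigl((y + x) + (y + x)\bigr) + x
      = y + 0 + x
      = y + x.
```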
In your case, you're asking about what a vector is. In a sense, there is no single thing called "a vector," because a vector is just something that obeys a bunch of rules. But any time you find something that looks like a vector, you immediately get a bunch of interesting facts about it - you can ask questions about spans, about changing basis, etc. - regardless of whether that thing you're looking at is a vector in the classical sense (a list of numbers, or an arrow pointing somewhere) or a vector in a more abstract sense (say, a function acting as a vector in a "vector space" made of functions.)
As a concluding remark, Grant Sanderson of 3blue1brown has an excellent video talking about what vectors are that explores this in more depth.
Comments:

+1 for your first sentence, which puts the modern algebraic/axiomatic viewpoint succinctly. – P Vanchinathan, Sep 23 '16 at 3:48

To add an example: in tomography, the image is a vector. So yeah, your insides, lungs, heart, and muscles are a vector in CT scans! – Ander Biguri, Sep 23 '16 at 10:34

This is an excellent explanation. For anyone else coming from a software development background, this is roughly analogous to how typeclasses, traits, and interfaces work! – Jules, Sep 23 '16 at 17:53

These pizkwats are such a great example! If anyone is interested in the proofs of the two statements: pastebin.com/mYJVfRRa – Vincent, Sep 23 '16 at 21:24

+1 for inclusion of Grant Sanderson's 3Blue1Brown video. His 'Essence of Linear Algebra' series of videos is fantastic for those who are looking for a quick introduction to the core concepts of Linear Algebra. – Perturbative, Sep 25 '16 at 4:44
When I was 14, I was introduced to vectors in a freshman physics course (algebra-based). We were told that a vector was a quantity with magnitude and direction. This is stuff like force, momentum, and electric field.
Three years later, in precalculus, we thought of them as "points," but with arrows emanating from the origin to those points. Just another thing. This was the concept that stuck until I took linear algebra two years later.
But now, in the abstract sense, vectors don't have to be these "arrows." They can be anything we want: functions, numbers, matrices, operators, whatever. When we build vector spaces (linear spaces in other texts), we just call the objects vectors - who cares what they look like? It's a name for an abstract object.
For example, in $\mathbb{R}^n$ our vectors are ordered $n$-tuples. In $\mathcal{C}[a,b]$ our vectors are now functions - continuous functions on $[a, b]$, at that. In $L^2(\mathbb{R})$ our vectors are those functions for which
$$ \int_{\mathbb{R}} | f |^2 < \infty $$
where the integral is taken in the Lebesgue sense.
Vectors are whatever we take them to be in the appropriate context.
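As an illustration of the function-space examples above, here is a minimal sketch (the helper names `vadd` and `smul` are the editor's, not the answer's) showing pointwise addition and scalar multiplication of functions acting as the vector operations:

```python
import math

# Pointwise operations make real-valued functions into "vectors":
def vadd(f, g):
    # vector addition of two functions
    return lambda x: f(x) + g(x)

def smul(c, f):
    # scalar multiplication of a function by a number c
    return lambda x: c * f(x)

# The "vector" sin + 2*cos, built purely from the two operations above.
h = vadd(math.sin, smul(2.0, math.cos))
assert abs(h(0.0) - 2.0) < 1e-12   # sin(0) + 2*cos(0) = 2
```

The zero vector here is the constant function 0, and the additive inverse of f is (-1)*f, so the vector space axioms can be checked one by one.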
Comments:

There are things which are not vectors, and spaces which are not vector spaces. Vectors can't be "anything you want". – Wouter, Sep 23 '16 at 9:34

Well, vectors can be anything we want as long as we have (or invent) a way to sum them and multiply them by scalars, and those operations follow a few quite reasonable rules. – Pere, Sep 23 '16 at 9:49

@Wouter: To wit, given any thing $x$, I can define a vector space whose only element is $x$, and the result of scalar multiplication and addition is always $x$. So there really is a vector space in which the thing $x$ is a vector. – Hurkyl, Sep 23 '16 at 11:16

@Hurkyl On the other hand, there are sets which cannot be turned into vector spaces (finite sets of most sizes). – Tobias Kildetoft, Sep 23 '16 at 18:23

Nit: electric charge is a scalar, not a vector (maybe you mean electric field). – kennytm, Sep 24 '16 at 9:31
It's an element of a set that is endowed with a certain structure, i.e. a set satisfying the axioms of a vector space.
Comments:

To add to this, the vector space axioms formalize the familiar rules of vector addition and scalar multiplication that work in $\mathbb R^2$ and $\mathbb R^3$. It's okay in the beginning to visualize an abstract vector space as $\mathbb R^3$, as long as you're careful not to confuse visualization with rigorous reasoning. – user37208, Sep 23 '16 at 0:32

That's at most a comment, not an answer. – Jannik Pitt, May 19 '17 at 19:24
This may be disconcerting at first, but the whole point of the abstract notion of vectors is to not tell you precisely what they are. In practice (that is, when using linear algebra in other areas of mathematics and the sciences, and there are a lot of areas that use linear algebra), a vector could be a real or complex valued function, a power series, a translation in Euclidean space, a description of a state of a quantum mechanical system, or something quite different still.
The reason all these diverse things are gathered under the common name of vector is that, for certain types of questions about all these things, a common way of reasoning can be applied; this is what linear algebra is about. In all cases there must be a definite (large) set of vectors (the vector space in which the vectors live), and operations of addition and scalar multiplication of vectors must be defined. What these operations are concretely may vary according to the nature of the vectors. Certain properties are required to hold in order to serve as a foundation for reasoning; these axioms say, for instance, that there must be a distinguished "zero" vector that is neutral for addition, and that addition of vectors is commutative (a good linear algebra course will give you the complete list).
Linear algebra will tell you what facts about vectors, formulated exclusively in terms of the vector space operations, can be deduced purely from those axioms. Some kinds of vectors have more operations defined than just those of linear algebra: for instance, power series can be multiplied together (while in general one cannot multiply two vectors), and functions allow talking about taking limits. However, proving statements about such operations will be based on facts other than the axioms of linear algebra, and will require a different kind of reasoning adapted to each case. In contrast, linear algebra focuses on a large body of common properties that can be derived in exactly the same way in all of the examples, because it does not involve these additional structures that may be present. It is for that reason that linear algebra speaks of vectors in an abstract manner, and limits its language to the operations of addition and scalar multiplication (and other notions that can be entirely defined in terms of them).
You seem to be thinking that a vector is something different depending on the field of study you are working in, but this is not true. The definition of a vector that you learn in linear algebra tells you everything you need to know about what a vector is in any setting: a vector is simply an element of a vector space, period - a vector space being any set that satisfies the axioms you've been given.
The vector space $\mathbb{R}^3$ that you are used to from physics is just one example of a vector space. So, to say that a vector is a column of numbers, or a geometric object with magnitude and direction, is incorrect. These are just specific examples of the many possible vectors that are out there.
I think you are looking for a very specific notion of what a vector is, when instead you should try to reconcile why all of the types of vectors that you are already used to using actually are vectors in the sense of the true definition you've been given in linear algebra.
Comments:

To be fair, a vector in computer science really is something that has very little to do with vectors in physics or in abstract algebra. – JohannesD, Sep 25 '16 at 20:29
Just to help understand the change of concept about vectors from physics to linear algebra, without pretending to be rigorous.
Consider that in (Newtonian) physics you work in a Euclidean space, so you can speak in terms of magnitude. In linear algebra we want to be able to define a vector in broader terms, in a reference system that is not necessarily orthogonal - what is called an affine space/subspace.
In fact, in affine geometry (which helps visualization) an oriented segment $\overrightarrow{AB}$ is an ordered pair of points, and a vector corresponds to the ordered $n$-tuple of the differences of their coordinates (the translation vector). A vector is therefore a representative of all the segments which are parallel, oriented in the same direction, and have the same "translation" (and not modulus, which is not defined - or better, is not preserved - under an affine change of coordinates).
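A minimal sketch of this idea (the helper `vec` is the editor's, not the answer's): two ordered pairs of points that differ only by a translation yield the same coordinate difference, i.e. the same free vector.

```python
# A free vector as the coordinate difference of an ordered pair of points.
# Points are tuples of coordinates in R^n.
def vec(A, B):
    return tuple(b - a for a, b in zip(A, B))

A, B = (1.0, 2.0), (4.0, 6.0)
C, D = (0.0, 0.0), (3.0, 4.0)

# (A, B) and (C, D) differ only by a translation, so both ordered pairs
# represent the same translation vector.
assert vec(A, B) == vec(C, D) == (3.0, 4.0)
```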
Literally, an element, or a point, in a vector space; but to reach direction and magnitude, the vector space requires an inner product. Almost every vector space you have seen does have one. An example vector space with an inner product is $\mathbb R^3$, which you probably use in physics a lot. In math, "element" and "point" are frequently interchangeable: the word "element" emphasizes the algebraic or simply set-based nature of the thing in question, whereas "point" emphasizes a geometric interpretation.
Inner products are not part of the definition of a vector space. A linear algebra course that always works with bases and matrices will not bother to define them since the basis of a finite dimensional space always defines an inner product. A theoretical linear algebra course will not include the inner product in the definition of a vector space, but will probably study them by end of semester.
You are more familiar with the "point" interpretation, it seems. So a vector is just a point. But, as a point, it comes with additional information, since a vector space also has an origin. Therefore each point corresponds to a line segment. The inner product gives the vector a direction from the origin to the point, and the inner product also gives the vector a magnitude. (Even more detail: every inner product automatically defines a norm, and the norm is mostly synonymous with magnitude).
Surely it is easier to say "the direction of a vector" than "the point defines a line segment," but you are right to be confused -- it's a lot of shorthand and skipped details to get from the barebones math of elements and sets, to intuitive geometric quantities of directions and quantities.
Every mathematician and physicist is fluent in this shorthand, and can apply it precisely as needed. You will see a lot of this in your math career, and you should always convince yourself that when steps are skipped, the steps are honest and precise.
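As a small numeric sketch of the shorthand above (the helper names `inner` and `norm` are illustrative, not from the answer): the inner product induces a magnitude, and dividing by it recovers a direction.

```python
import math

def inner(u, v):
    # the standard dot product on tuples of coordinates
    return sum(a * b for a, b in zip(u, v))

def norm(u):
    # the magnitude (norm) induced by the inner product
    return math.sqrt(inner(u, u))

v = (3.0, 4.0)
assert norm(v) == 5.0                   # the familiar Euclidean length in R^2
unit = tuple(a / norm(v) for a in v)    # the "direction": a unit vector toward v
assert abs(norm(unit) - 1.0) < 1e-12
```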
Comments:

@GrumpyParsnip fixed. Pretty obvious typo; you can also edit directly. – djechlin, Sep 25 '16 at 20:50
The YouTube channel 3Blue1Brown has recently put out an amazing short series on the "Essence of linear algebra". As it so happens, the first chapter is called "Vectors, what even are they?" and is an outstanding explanation, far simpler than any of the answers above: https://www.youtube.com/watch?v=fNk_zzaMoSs
While I highly recommend just watching the video (since vectors are really best understood visually), I'll try to summarize here: vectors are simply lists of numbers - that's really it. They can be used in a geometric sense (similar to the physics sense you're already familiar with on a grid) where each number represents a coordinate relative to some "axes" (formally called "basis vectors"). In the most common case the basis vectors are $\hat{i}$, a 1-unit-long vector pointing directly right along the x-axis (represented as $[1, 0]$), and $\hat{j}$, a 1-unit-long vector orthogonal to $\hat{i}$ pointing up along the y-axis (represented as $[0, 1]$); vectors are then just coordinates on the plane. So in that case $[1, 1]$ is a vector pointing up and to the right, from the origin to the coordinate $(1, 1)$.
The video also goes into how vectors can also be seen as geometric transformations of the plane (e.g. squashing, stretching, shearing, or rotating), but that's something you really need to see to understand.
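The basis-vector summary above can be sketched in a few lines (plain tuples stand in for vectors; the helper `combine` is the editor's, not from the video):

```python
def combine(a, i_hat, b, j_hat):
    # the linear combination a*i_hat + b*j_hat, computed componentwise
    return tuple(a * x + b * y for x, y in zip(i_hat, j_hat))

i_hat = (1.0, 0.0)   # unit vector along the x-axis
j_hat = (0.0, 1.0)   # unit vector along the y-axis

# [1, 1] is the combination 1*i_hat + 1*j_hat: one step right, one step up.
assert combine(1.0, i_hat, 1.0, j_hat) == (1.0, 1.0)
```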
Comments:

Better as a comment containing the link, or an answer that uses the link to help. – 6005, Sep 25 '16 at 6:18

@6005: Good point. Added a short summary of the video. – 0x24a537r9, Sep 25 '16 at 6:35
Your question is a nice example of how the mathematics of today works and how certain notions emerged. There is no fancy definition of a vector in mathematics; the content of what a vector has to be was shifted to other objects: mathematicians have abstracted certain properties of "objects" that appear in geometry or physics. This better fits the axiomatic requirements of today's mathematics.
Near the end of the 19th century, a "vector" was an ordered pair (A,B) of points in an affine space. This was also called a "fixed vector", where one could imagine (A,B) to be an arrow beginning at point A and ending with its tip at point B. In today's differential geometry one finds some relics of this situation, when a "vector" is usually given together with the base point to which it is attached.
In mechanics, there appeared so-called "line-bound vectors": vectors that were considered equivalent if they differed only by a translation along the line through A and B (if A is unequal to B). "Free vectors" were vectors considered equivalent if they differed only by a translation in the affine space. Free vectors can represent translations. Translations can be composed and inverted - they form a group. Translations can be scaled by multiplying with a number.
From these properties emerged what is called a "vector space".
Due to the axiomatic requirements of mathematics, one puts the cart before the horse:
First, one defines - abstractly - a "vector space" over a field $(K,+,0,\cdot,1)$ to be a group $(V,+,0)$ on which $K$ acts "compatibly" via a homomorphism (of rings with unit) from the field $K$ to the group endomorphisms of $V$:
$$(K,+,0,\cdot,1) \to (\mathrm{Hom}_{\mathrm{Grp}}(V,V),+,0,\circ,\mathrm{id}_V).$$
Afterwards, one defines a "vector" to be an element of a vector space.
So the abstracted properties appear in the definition of a vector space, not in the definition of a vector.
The original geometric content of a vector appears only later as a very special case, when a real vector space acts on a (real) affine space as its space of translations, and when these translations are depicted eg. by arrows.
(N.B.: Historically speaking, a vector was not even a pair of points, but could have different meanings, e.g. as a pair of parallel planes in 3-dimensional affine space. It also took some time until the notion of a vector space emerged and until various fields K of scalars, or even skew fields, were admitted. Generalizing the notion of a K-vector space from a field K to a commutative ring R with unit gives today's notion of an R-module.)
Your Answer
StackExchange.ifUsing("editor", function () {
return StackExchange.using("mathjaxEditing", function () {
StackExchange.MarkdownEditor.creationCallbacks.add(function (editor, postfix) {
StackExchange.mathjaxEditing.prepareWmdForMathJax(editor, postfix, [["$", "$"], ["\\(","\\)"]]);
});
});
}, "mathjax-editing");
StackExchange.ready(function() {
var channelOptions = {
tags: "".split(" "),
id: "69"
};
initTagRenderer("".split(" "), "".split(" "), channelOptions);
StackExchange.using("externalEditor", function() {
// Have to fire editor after snippets, if snippets enabled
if (StackExchange.settings.snippets.snippetsEnabled) {
StackExchange.using("snippets", function() {
createEditor();
});
}
else {
createEditor();
}
});
function createEditor() {
StackExchange.prepareEditor({
heartbeatType: 'answer',
autoActivateHeartbeat: false,
convertImagesToLinks: true,
noModals: true,
showLowRepImageUploadWarning: true,
reputationToPostImages: 10,
bindNavPrevention: true,
postfix: "",
imageUploader: {
brandingHtml: "Powered by u003ca class="icon-imgur-white" href="https://imgur.com/"u003eu003c/au003e",
contentPolicyHtml: "User contributions licensed under u003ca href="https://creativecommons.org/licenses/by-sa/3.0/"u003ecc by-sa 3.0 with attribution requiredu003c/au003e u003ca href="https://stackoverflow.com/legal/content-policy"u003e(content policy)u003c/au003e",
allowUrls: true
},
noCode: true, onDemand: true,
discardSelector: ".discard-answer"
,immediatelyShowMarkdownHelp:true
});
}
});
Sign up or log in
StackExchange.ready(function () {
StackExchange.helpers.onClickDraftSave('#login-link');
});
Sign up using Google
Sign up using Facebook
Sign up using Email and Password
Post as a guest
Required, but never shown
StackExchange.ready(
function () {
StackExchange.openid.initPostLogin('.new-post-login', 'https%3a%2f%2fmath.stackexchange.com%2fquestions%2f1937464%2fin-linear-algebra-what-is-a-vector%23new-answer', 'question_page');
}
);
Post as a guest
Required, but never shown
9 Answers
9
active
oldest
votes
9 Answers
9
active
oldest
votes
active
oldest
votes
active
oldest
votes
$begingroup$
In modern mathematics, there's a tendency to define things in terms of what they do rather than in terms of what they are.
As an example, suppose that I claim that there are objects called "pizkwats" that obey the following laws:
- $forall x. forall y. exists z. x + y = z$
- $exists x. x = 0$
- $forall x. x + 0 = 0 + x = x$
- $forall x. forall y. forall z. (x + y) + z = x + (y + z)$
- $forall x. x + x = 0$
These rules specify what pizkwats do by saying what rules they obey, but they don't say anything about what pizkwats are. We can find all sorts of things that we could call pizkwats. For example, we could imagine that pizkwats are the numbers 0 and 1, with addition being done modulo 2. They could also be bitstrings of length 137, with "addition" meaning "bitwise XOR." Both of these groups of objects obey the rules for what pizkwats do, but neither of them "are" pizkwats.
The advantage of this approach is that we can prove results about pizkwats knowing purely how they behave rather than what they fundamentally are. For example, as a fun exercise, see if you can use the above rules to prove that
$forall x. forall y. x + y = y + x$.
$begingroup$
In modern mathematics, there's a tendency to define things in terms of what they do rather than in terms of what they are.
As an example, suppose that I claim that there are objects called "pizkwats" that obey the following laws:
- $\forall x. \forall y. \exists z.\ x + y = z$
- $\exists x.\ x = 0$
- $\forall x.\ x + 0 = 0 + x = x$
- $\forall x. \forall y. \forall z.\ (x + y) + z = x + (y + z)$
- $\forall x.\ x + x = 0$
These rules specify what pizkwats do by saying what rules they obey, but they don't say anything about what pizkwats are. We can find all sorts of things that we could call pizkwats. For example, we could imagine that pizkwats are the numbers 0 and 1, with addition being done modulo 2. They could also be bitstrings of length 137, with "addition" meaning "bitwise XOR." Both of these groups of objects obey the rules for what pizkwats do, but neither of them "are" pizkwats.
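For concreteness, here's a quick brute-force check (just a sketch of mine, not part of the original argument; the helper `check_pizkwat` and the length-3 bitstrings are illustrative stand-ins) that both proposed models really do obey the listed rules:

```python
# Brute-force check that a finite model satisfies the pizkwat axioms.
from itertools import product

def check_pizkwat(elements, add, zero):
    """Return True if (elements, add, zero) obeys all five pizkwat rules."""
    elems = list(elements)
    closure = all(add(x, y) in elems for x, y in product(elems, repeat=2))
    identity = all(add(x, zero) == x and add(zero, x) == x for x in elems)
    assoc = all(add(add(x, y), z) == add(x, add(y, z))
                for x, y, z in product(elems, repeat=3))
    self_inv = all(add(x, x) == zero for x in elems)
    return closure and identity and assoc and self_inv

# Model 1: the numbers 0 and 1 with addition modulo 2.
print(check_pizkwat({0, 1}, lambda x, y: (x + y) % 2, 0))  # True

# Model 2: bitstrings under bitwise XOR (length 3 here instead of 137).
bitstrings = [format(n, "03b") for n in range(8)]
xor = lambda a, b: "".join("1" if u != v else "0" for u, v in zip(a, b))
print(check_pizkwat(bitstrings, xor, "000"))  # True
```

Both models pass, which is exactly the point: they are pizkwats only in the sense that they satisfy the rules.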
The advantage of this approach is that we can prove results about pizkwats knowing purely how they behave rather than what they fundamentally are. For example, as a fun exercise, see if you can use the above rules to prove that
$\forall x. \forall y.\ x + y = y + x$.
This means that anything that "acts like a pizkwat" must support a commutative addition operator. Similarly, we could prove that
$\forall x. \forall y.\ (x + y = 0 \rightarrow x = y)$.
The advantage of setting things up this way is that whenever we find something that "looks like a pizkwat," in the sense that it obeys the rules given above, we're guaranteed that it has these further properties: its addition is commutative, and every element is its own (and therefore unique) inverse. We could develop a whole elaborate theory of how pizkwats behave purely from the rules they obey, and since we never actually said what a pizkwat is, anything we find that looks like a pizkwat instantly falls within that theory.
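As a sanity check (again my own sketch, not part of the original answer), we can confirm by brute force that these two derived facts hold in the bitstring-XOR model even though neither was assumed directly:

```python
# Confirm the two derived pizkwat theorems in the bitstring-XOR model.
from itertools import product

bits = [format(n, "03b") for n in range(8)]   # length 3 stands in for 137
xor = lambda a, b: "".join("1" if u != v else "0" for u, v in zip(a, b))

# Derived fact 1: addition is commutative.
assert all(xor(x, y) == xor(y, x) for x, y in product(bits, repeat=2))

# Derived fact 2: x + y = 0 forces x = y (each element is its own inverse).
assert all(x == y for x, y in product(bits, repeat=2) if xor(x, y) == "000")

print("both derived properties hold")
```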
In your case, you're asking about what a vector is. In a sense, there is no single thing called "a vector," because a vector is just something that obeys a bunch of rules. But any time you find something that looks like a vector, you immediately get a bunch of interesting facts about it - you can ask questions about spans, about changing basis, etc. - regardless of whether that thing you're looking at is a vector in the classical sense (a list of numbers, or an arrow pointing somewhere) or a vector in a more abstract sense (say, a function acting as a vector in a "vector space" made of functions.)
As a concluding remark, Grant Sanderson of 3blue1brown has an excellent video talking about what vectors are that explores this in more depth.
$endgroup$
edited Sep 24 '16 at 23:22
answered Sep 22 '16 at 23:29
templatetypedef
4,56622457
14
$begingroup$
+1 for your first sentence which puts succinctly the modern algebraic/axiomatic viewpoint.
$endgroup$
– P Vanchinathan
Sep 23 '16 at 3:48
1
$begingroup$
To add an example: In Tomography, the image is a vector. So yeah, your insides, lungs, heart, muscles are a vector in CT scans!
$endgroup$
– Ander Biguri
Sep 23 '16 at 10:34
5
$begingroup$
This is an excellent explanation. For anyone else coming from a software development background, this is roughly analogous to how typeclasses, traits, and interfaces work!
$endgroup$
– Jules
Sep 23 '16 at 17:53
$begingroup$
These pizkwats are such a great example! If anyone is interested in the proofs of the two statements: pastebin.com/mYJVfRRa
$endgroup$
– Vincent
Sep 23 '16 at 21:24
3
$begingroup$
+1 for inclusion of Grant Sanderson's 3Blue1Brown video. His 'Essence of Linear Algebra' series of videos is fantastic for those who are looking for a quick introduction to the core concepts of Linear Algebra.
$endgroup$
– Perturbative
Sep 25 '16 at 4:44
|
show 2 more comments
$begingroup$
When I was 14, I was introduced to vectors in a freshman physics course (algebra based). We were told that it was a quantity with magnitude and direction. This is stuff like force, momentum, and electric field.
Three years later in precalculus we thought of them as "points," but with arrows emanating from the origin to that point. Just another thing. This was the concept that stuck until I took linear algebra two more years later.
But now, in the abstract sense, vectors don't have to be these "arrows." They can be anything we want: functions, numbers, matrices, operators, whatever. When we build vector spaces (linear spaces in other texts), we just call the objects vectors - who cares what they look like? It's a name for an abstract object.
For example, in $\mathbb{R}^n$ our vectors are ordered $n$-tuples. In $\mathcal{C}[a,b]$ our vectors are now functions - continuous functions on $[a, b]$ at that. In $L^2(\mathbb{R})$ our vectors are those functions for which
$$ \int_{\mathbb{R}} |f|^2 < \infty, $$
where the integral is taken in the Lebesgue sense.
Vectors are whatever we take them to be in the appropriate context.
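To make the function-space case concrete, here is a tiny sketch (my illustration, not from the original answer) of pointwise addition and scalar multiplication, under which ordinary functions behave exactly like vectors:

```python
import math

# Pointwise operations turn functions into "vectors".
def add(f, g):
    return lambda x: f(x) + g(x)        # vector addition

def scale(c, f):
    return lambda x: c * f(x)           # scalar multiplication

zero = lambda x: 0.0                    # the zero vector of this space

h = add(scale(2.0, math.sin), math.cos)  # the linear combination 2*sin + cos
print(h(0.0))                            # 1.0, since 2*sin(0) + cos(0) = 1
print(add(math.sin, zero)(1.2) == math.sin(1.2))  # True: zero is neutral
```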
$endgroup$
edited Sep 25 '16 at 15:35
answered Sep 22 '16 at 19:11
Sean Roberson
6,39031327
10
$begingroup$
There are things which are not vectors, and spaces which are not vector spaces. Vectors can't be "anything you want".
$endgroup$
– Wouter
Sep 23 '16 at 9:34
3
$begingroup$
Well, vectors can be anything we want as long as we have (or invent) a way to sum them and multiply them by scalars, and those operations follow a few quite reasonable rules.
$endgroup$
– Pere
Sep 23 '16 at 9:49
10
$begingroup$
@Wouter: To wit, given any thing $x$, I can define a vector space whose only element is $x$, and the result of scalar multiplication and addition is always $x$. So there really is a vector space in which the thing $x$ is a vector.
$endgroup$
– Hurkyl
Sep 23 '16 at 11:16
3
$begingroup$
@Hurkyl On the other hand, there are sets which cannot be turned into vector spaces (finite sets of most sizes).
$endgroup$
– Tobias Kildetoft
Sep 23 '16 at 18:23
2
$begingroup$
Nit: Electric charge is a scalar, not a vector (maybe you mean electric field).
$endgroup$
– kennytm
Sep 24 '16 at 9:31
|
show 1 more comment
$begingroup$
It's an element of a set endowed with a certain structure, i.e. one satisfying the axioms of a vector space.
$endgroup$
answered Sep 22 '16 at 19:05
qbert
22k32460
4
$begingroup$
To add to this, the vector space axioms formalize the familiar rules of vector addition and scalar multiplication that work in $\mathbb{R}^2$ and $\mathbb{R}^3$. It's okay in the beginning to visualize an abstract vector space as $\mathbb{R}^3$, as long as you're careful not to confuse visualization with rigorous reasoning.
$endgroup$
– user37208
Sep 23 '16 at 0:32
$begingroup$
That's at most a comment, not an answer.
$endgroup$
– Jannik Pitt
May 19 '17 at 19:24
add a comment |
$begingroup$
This may be disconcerting at first, but the whole point of the abstract notion of vectors is to not tell you precisely what they are. In practice (that is, when using linear algebra in other areas of mathematics and the sciences, and there are a lot of areas that use linear algebra), a vector could be a real or complex valued function, a power series, a translation in Euclidean space, a description of a state of a quantum mechanical system, or something quite different still.
The reason all these diverse things are gathered under the common name of vector is that for certain types of questions about all these things, a common way of reasoning can be applied; this is what linear algebra is about. In all cases there must be a definite (large) set of vectors (the vector space in which the vectors live), and operations of addition and scalar multiplication of vectors must be defined. What these operations are concretely may vary according to the nature of the vectors. Certain properties are required to hold in order to serve as a foundation for reasoning; these axioms say, for instance, that there must be a distinguished "zero" vector that is neutral for addition, and that addition of vectors is commutative (a good linear algebra course will give you the complete list).
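As a concrete miniature (my own illustration, not part of the original answer), one can spot-check two of those axioms on pairs of reals with componentwise operations:

```python
from itertools import product

# Sample "vectors": pairs of reals, added componentwise.
vecs = list(product([-1.0, 0.0, 2.5], repeat=2))
add = lambda u, v: (u[0] + v[0], u[1] + v[1])
zero = (0.0, 0.0)

# A distinguished zero vector that is neutral for addition.
assert all(add(v, zero) == v for v in vecs)

# Addition of vectors is commutative.
assert all(add(u, v) == add(v, u) for u, v in product(vecs, repeat=2))

print("sampled axioms hold")
```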
Linear algebra will tell you what facts about vectors, formulated exclusively in terms of the vector space operations, can be deduced purely from those axioms. Some kinds of vectors have more operations defined than just those of linear algebra: for instance, power series can be multiplied together (while in general one cannot multiply two vectors), and functions allow talking about taking limits. However, proving statements about such operations will be based on facts other than the axioms of linear algebra, and will require a different kind of reasoning adapted to each case. In contrast, linear algebra focuses on a large body of common properties that can be derived in exactly the same way in all of the examples, because it does not involve these additional structures at all. It is for that reason that linear algebra speaks of vectors in an abstract manner, and limits its language to the operations of addition and scalar multiplication (and other notions that can be entirely defined in terms of them).
$endgroup$
add a comment |
$begingroup$
This may be disconcerting at first, but the whole point of the abstract notion of vectors is to not tell you precisely what they are. In practice (that is, when using linear algebra in other areas of mathematics and the sciences, and there are a lot of areas that use linear algebra), a vector could be a real or complex valued function, a power series, a translation in Euclidean space, a description of a state of a quantum mechanical system, or something quite different still.
The reason all these diverse things are gathered under the common name of vector, is that for certain type of questions about all these things, a common way of reasoning can be applied; this is what linear algebra is about. In all cases there must be a definite (large) set of vectors (the vector space in which the vectors live), and operations of addition and scalar multiplication of vectors must be defined. What these operations are concretely may vary according to the nature of the vectors. Certain properties are required to hold in order to serve as a foundation for reasoning; these axioms say for instance that there must be a distinguished "zero" vector that is neutral for addition, that addition of vectors is commutative (a good linear algebra course will give you the complete list).
Linear algebra will tell you what facts about vectors, formulated exclusively in terms of the vector space operations, can be deduced purely from those axioms. Some kinds of vectors have more operations defined than just those of linear algebra: for instance power series can be multiplied together (while in general one cannot multiply two vectors), and functions allow talking about taking limits. However, proving statements about such operations will be based on other facts than the axioms of linear algebra, and will require a different kind of reasoning adapted to each case. In contrast, linear algebra focusses on a large body of common properties that can be derived in exactly the same way in all to the examples, because it does not involve at all these additional structures that may be present. It is for that reason that linear algebra speaks of vectors in an abstract manner, and limits its language to the operations of addition and scalar multiplication (and other notions that can be entirely defined in terms of them).
$endgroup$
answered Sep 23 '16 at 3:43
Marc van Leeuwen
$begingroup$
You seem to be thinking that a vector is something different depending on the field of study you are working in, but this is not true. The definition of a vector that you learn in linear algebra tells you everything you need to know about what a vector is in any setting. A vector is simply an element of a vector space, period; a vector space is any set, equipped with addition and scalar multiplication, that satisfies the axioms you've been given.
The vector space $\mathbb{R}^3$ that you are used to from physics is just one example of a vector space. So, to say that a vector is a column of numbers, or a geometric object with magnitude and direction, is incorrect. These are just specific examples of the many possible vectors that are out there.
I think you are looking for a very specific notion of what a vector is, when instead you should try to reconcile why all of the types of vectors that you are already used to using actually are vectors in the sense of the true definition you've been given in linear algebra.
$endgroup$
2
$begingroup$
To be fair, a vector in computer science really is something that has very little to do with vectors in physics or in abstract algebra.
$endgroup$
– JohannesD
Sep 25 '16 at 20:29
answered Sep 23 '16 at 1:22
wgrenard
$begingroup$
Just to help understand the change in the concept of a vector from physics to linear algebra, without pretending to be rigorous.
Consider that in (Newtonian) physics you work in a Euclidean space, so you can speak in terms of magnitude. In linear algebra we want to be able to define a vector in broader terms, in a reference system that is not necessarily orthogonal, in what is called an affine space/subspace.
In fact, in affine geometry (which helps to visualize) an oriented segment $\overrightarrow{AB}$ is an ordered pair of points, and a vector corresponds to the ordered $n$-tuple of the differences of their coordinates (the translation vector). A vector therefore is a representative of all the segments, oriented in the same direction, which are parallel and have the same "translation" (and not modulus, which is not defined, or better, is not preserved under an affine change of coordinates).
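As a small illustrative sketch of this (my own, not from the answer): two oriented segments represent the same free vector exactly when the coordinate differences of their endpoints agree.

```python
# The translation vector of an oriented segment is the componentwise
# difference of its endpoints' coordinates.

def translation(A, B):
    """The vector of the oriented segment from point A to point B."""
    return tuple(b - a for a, b in zip(A, B))

A, B = (1, 2), (4, 6)
C, D = (0, 0), (3, 4)

# (A,B) and (C,D) are parallel, equally oriented segments with the same
# translation, so they represent the same free vector:
assert translation(A, B) == translation(C, D)  # both are (3, 4)
```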
$endgroup$
answered Sep 22 '16 at 20:44
G Cab
$begingroup$
Literally, an element, or a point, in a vector space; but to reach direction and magnitude, the vector space requires an inner product. Almost every vector space you have seen does. An example vector space with an inner product is $\mathbb{R}^3$, which you probably use in physics a lot. In math, "element" and "point" are frequently interchangeable: the word "element" emphasizes the algebraic or simply set-based nature of the thing in question, whereas "point" emphasizes a geometric interpretation.
Inner products are not part of the definition of a vector space. A linear algebra course that always works with bases and matrices will not bother to define them, since a basis of a finite-dimensional space always defines an inner product (the one that makes that basis orthonormal). A theoretical linear algebra course will not include the inner product in the definition of a vector space, but will probably study inner products by the end of the semester.
You are more familiar with the "point" interpretation, it seems. So a vector is just a point. But, as a point, it comes with additional information, since a vector space also has an origin. Therefore each point corresponds to a line segment. The inner product gives the vector a direction from the origin to the point, and the inner product also gives the vector a magnitude. (Even more detail: every inner product automatically defines a norm, and the norm is mostly synonymous with magnitude).
Surely it is easier to say "the direction of a vector" than "the point defines a line segment," but you are right to be confused -- it's a lot of shorthand and skipped details to get from the barebones math of elements and sets to intuitive geometric notions of direction and magnitude.
Every mathematician and physicist is fluent in this shorthand, and can apply it precisely as needed. You will see a lot of this in your math career, and you should always convince yourself that when steps are skipped, the steps are honest and precise.
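A minimal sketch of the point above (my own, assuming the standard inner product on $\mathbb{R}^3$): both the magnitude and the direction of a vector are derived from the inner product.

```python
import math

def inner(u, v):
    """Standard inner product on R^3: <u, v> = sum of u_i * v_i."""
    return sum(a * b for a, b in zip(u, v))

def norm(v):
    """The norm induced by the inner product: ||v|| = sqrt(<v, v>)."""
    return math.sqrt(inner(v, v))

def direction(v):
    """Unit vector from the origin toward the point v."""
    n = norm(v)
    return tuple(a / n for a in v)

v = (3.0, 4.0, 0.0)
print(norm(v))       # 5.0
print(direction(v))  # (0.6, 0.8, 0.0)
```

Without `inner`, the tuple `v` is just an element of the space; it is the inner product that turns it into something with a magnitude and a direction.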
$endgroup$
$begingroup$
@GrumpyParsnip fixed. pretty obvious typo, you can edit directly also.
$endgroup$
– djechlin
Sep 25 '16 at 20:50
edited Sep 25 '16 at 20:50
answered Sep 25 '16 at 7:33
djechlin
$begingroup$
The YouTube channel 3Blue1Brown has recently put out an amazing short series on the "Essence of linear algebra". As it so happens, the first chapter is called "Vectors, what even are they?" and is an outstanding explanation, far simpler than any of the answers above: https://www.youtube.com/watch?v=fNk_zzaMoSs
While I highly recommend just watching the video (since vectors are really best understood visually), I'll try to summarize here: vectors are simply lists of numbers--that's really it. They can be used in a geometric sense (similar to the physics sense you're already familiar with on a grid) where each number represents coordinates relative to some "axes" (formally called "basis vectors"). In the most common case, with basis vectors $\hat{i}$, a 1-unit-long vector pointing directly right along the x-axis (represented as $[1, 0]$), and $\hat{j}$, a 1-unit-long vector orthogonal to $\hat{i}$ pointing up along the y-axis (represented as $[0, 1]$), vectors are just coordinates on the plane. So in that case $[1, 1]$ is a vector pointing up and to the right from the origin to the coordinate $(1, 1)$.
The video also goes into how vectors can also be seen as geometric transformations of the plane (e.g. squashing, stretching, shearing, or rotating), but that's something you really need to see to understand.
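The summary above can be sketched in a few lines (my own illustration, not from the video): any plane vector is a combination of the basis vectors $\hat{i} = [1, 0]$ and $\hat{j} = [0, 1]$.

```python
# Basis vectors of the plane, as in the summary above.
i_hat = (1, 0)
j_hat = (0, 1)

def combine(a, b):
    """The vector a * i_hat + b * j_hat, componentwise."""
    return (a * i_hat[0] + b * j_hat[0], a * i_hat[1] + b * j_hat[1])

# The vector [1, 1] points from the origin to the coordinate (1, 1):
assert combine(1, 1) == (1, 1)
# Scaling the basis vectors reaches any point of the plane:
assert combine(3, -2) == (3, -2)
```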
$endgroup$
$begingroup$
Better as a comment containing the link. or an answer with the link to help.
$endgroup$
– 6005
Sep 25 '16 at 6:18
1
$begingroup$
@6005: Good point. Added a short summary of the video.
$endgroup$
– 0x24a537r9
Sep 25 '16 at 6:35
edited Sep 25 '16 at 6:35
answered Sep 25 '16 at 6:04
0x24a537r9
$begingroup$
Your question is a nice example of how mathematics works today and how certain notions emerged. There is no fancy definition of a vector in mathematics; the content of what a vector has to be was shifted to other objects: mathematicians have abstracted certain properties of "objects" that appear in geometry or physics. This fits better the axiomatic requirements of today's mathematics.
Near the end of the 19th century, a "vector" was an ordered pair (A,B) of points in an affine space. This was also called a "fixed vector", where one could imagine (A,B) to be an arrow beginning at point A and ending with its tip at point B. In today's differential geometry one finds some relics of this situation, when a "vector" is usually given together with the base point to which it is attached.
In mechanics, there appeared so-called "line-bound vectors": vectors that were considered equivalent if they differed only by a translation along the line through A and B (if A is unequal to B). "Free vectors" were vectors considered to be equivalent if they differed only by a translation in the affine space. Free vectors can represent translations. Translations can be composed and inverted - they form a group. Translations can be scaled by multiplying with a number.
From these properties emerged what is called a "vector space".
Due to the axiomatic requirements of mathematics, one puts the cart before the horse:
First, one defines - abstractly - a "vector space" (over a field $(K,+,0,\cdot,1)$) to be a group $(V,+,0)$ on which $K$ acts "compatibly" via a homomorphism (of rings with unit) from the field $K$ to the group endomorphisms of $V$:
$$(K,+,0,\cdot,1) \to (\operatorname{Hom}_{Grp}(V,V),+,0,\circ,\operatorname{id}_V).$$
Afterwards, one defines a "vector" to be an element of a vector space.
So the abstracted properties appear in the definition of a vector space, not in the definition of a vector.
The original geometric content of a vector appears only later, as a very special case, when a real vector space acts on a (real) affine space as its space of translations, and when these translations are depicted, e.g., by arrows.
(N.B.: Historically speaking, a vector was not even a pair of points, but could have different meanings, e.g. as a pair of parallel planes in 3-dimensional affine space. It also took some time until the notion of a vector space emerged and until various fields K of scalars, or even skew fields, were admitted. Generalizing the notion of a K-vector space from a field K to a commutative ring R with unit gives today's notion of an R-module.)
$endgroup$
add a comment |
$begingroup$
Your question is a nice example of how today's mathematics works and how certain notions emerged. There is no fancy definition of a vector in mathematics; the meaning a vector was expected to carry has been shifted onto other objects: mathematicians abstracted certain properties of "objects" that appear in geometry or physics, which fits the axiomatic requirements of today's mathematics better.
Near the end of the 19th century, a "vector" was an ordered pair (A,B) of points in an affine space. This was also called a "fixed vector", where one could imagine (A,B) to be an arrow beginning at point A and ending with its tip at point B. In today's differential geometry one finds relics of this situation, where a "vector" is usually given together with the base point to which it is attached.
In mechanics there appeared so-called "line-bound vectors": vectors considered equivalent if they differed only by a translation along the line through A and B (provided A $\neq$ B). "Free vectors" were vectors considered equivalent if they differed only by a translation of the affine space. Free vectors can represent translations. Translations can be composed and inverted, so they form a group, and they can be scaled by multiplying with a number.
From these properties emerged what is called a "vector space".
Due to the axiomatic requirements of mathematics, one puts the cart before the horse:
First, one defines - abstractly - a "vector space" over a field $(K,+,0,\cdot,1)$ to be an abelian group $(V,+,0)$ on which $K$ acts "compatibly", via a homomorphism (of rings with unit) from the field $K$ to the ring of group endomorphisms of $V$:
$(K,+,0,\cdot,1) \to (\operatorname{Hom}_{\mathrm{Grp}}(V,V),+,0,\circ,\operatorname{id}_V).$
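Spelled out, if one writes the action of a scalar $\lambda \in K$ on $v \in V$ as $\lambda v$, unpacking this homomorphism condition recovers exactly the familiar vector-space axioms (a sketch, using the standard notation above):

```latex
\begin{align*}
\lambda(v+w) &= \lambda v + \lambda w && \text{(each $\lambda$ acts as a group endomorphism of $V$)}\\
(\lambda+\mu)v &= \lambda v + \mu v && \text{(the map $K \to \operatorname{Hom}_{\mathrm{Grp}}(V,V)$ is additive)}\\
(\lambda\mu)v &= \lambda(\mu v) && \text{(products of scalars go to compositions of endomorphisms)}\\
1\,v &= v && \text{(the unit $1$ is sent to $\operatorname{id}_V$)}
\end{align*}
```

So the eight axioms one usually memorizes are nothing but this single compatibility requirement written out termwise.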
Afterwards, one defines a "vector" to be an element of a vector space.
So the abstracted properties appear in the definition of a vector space, not in the definition of a vector.
The original geometric content of a vector appears only later, as a very special case: when a real vector space acts on a (real) affine space as its space of translations, and when these translations are depicted, e.g., by arrows.
(N.B.: Historically speaking, a vector was not even always a pair of points; it could have different meanings, e.g. as a pair of parallel planes in 3-dimensional affine space. It also took some time until the notion of a vector space emerged and until various fields K of scalars, or even skew fields, were admitted. Generalizing the notion of a K-vector space from a field K to a commutative ring R with unit gives today's notion of an R-module.)
$endgroup$
add a comment |
answered Jan 8 at 14:02
ASlateff
263
4
$begingroup$
The reason is that physicists care about the definition of a vector. In linear algebra, the definition of a vector is irrelevant; what is important is the definition of a vector space.
$endgroup$
– Asaf Karagila♦
Sep 23 '16 at 4:53
1
$begingroup$
@AsafKaragila: Sort of. Actually in modern physics they care that a quantity transforms like a vector. So ultimately it's still defined by behaviour, just that it takes more behaviour than the vector space axioms alone.
$endgroup$
– celtschk
Sep 23 '16 at 6:27