
Commit 3a0489e

JimClarke5 and rnett authored Feb 1, 2021
Metrics Phase 1 (#180)
* Initial checkin and sync with master
* JavaDoc cleanup and fixes
* Change LossInterface to LossMetric; fix JavaDoc; modify one-line code blocks to include braces
* Removed the hashmap for variables; it is not needed because the variables only live within a single instance of a Metric
* Reformat code
* Add tests for assertBroadcastable
* Change 'type' to 'resultType'
* Added a V data type for sampleWeights so that it is not forced to be the same type as the return or internal variables
* Clean up mean and fix assertBroadcastable
* Fix error message
* Change sampleWeights to have its own generic type <S extends TNumber>
* Add comment about invalid tests expecting IllegalArgumentException
* Throw NotBroadcastableException instead of the more generic IllegalArgumentException when static shapes cannot broadcast
* Rename hasValidNonscalarShape to canBroadcastNonscalarShapes
* Fix Javadoc; move the dynamic shapes and rank down to the dynamic section so they are not created needlessly when the shapes are static; fix the if statement to check for unknown size and unknown dimensions
* Fix Reduce to use broadcastWeights; renamed WeightBroadcastTest to AssertBroadcastableTest and added BroadcastWeightsTest
* Added a comment to count to indicate that it may be weighted
* Added SetsOps and fixed AssertBroadcastable to use SetsOps methods
* Fixed based on various PR comments
* Deleted files no longer needed after the change to Variable handling in Metrics
* Nicer error messages for mode-forbidden ops (#169): start forbidden-op checks; fix style; move checks to builder method
* Initialization improvements (#178): no-op on initAdd in eager mode; add runInit() method to Session; add doInitialization() to Runner, later removed with javadoc updates; assume only graph or eager environments; small fixes
* Clarify tensorOf lifetime requirements (#190) and run codegen
* Remove extra generics from op generation (#193): remove the extra type params, generate covariant types, run generation, update help text
* Add Java 11 support, initial phase (#185): add a profile for JDK 11 and Automatic-Module-Name to the jars; add maven.compiler.release=11
* Update manual ops for the new codegen (#196)
* Fix Losses to use CHANNELS_FIRST/CHANNELS_LAST for CategoricalCrossentropy
* Fix SetsOps to properly convert a sparse tensor to a dense tensor using tf.sparse.sparseToDense with the output of tf.sparse.denseToDenseSetOperation

Signed-off-by: Ryan Nett <rnett@calpoly.edu>
Co-authored-by: Ryan Nett <rnett@calpoly.edu>
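The assertBroadcastable/canBroadcastNonscalarShapes work above checks whether a sample-weights shape can broadcast to a values shape before the weights are applied. A minimal plain-Java sketch of the right-aligned broadcasting rule being checked (a hypothetical helper for illustration, not the actual tensorflow-framework implementation, which operates on Shape objects and throws NotBroadcastableException):

```java
// Hypothetical sketch of a static broadcast-compatibility check.
public class BroadcastCheck {

  /**
   * Returns true if a weights shape can broadcast to a values shape under the
   * usual right-aligned rule: each weight dimension must be 1 or equal to the
   * corresponding (trailing-aligned) value dimension.
   */
  public static boolean canBroadcast(int[] weights, int[] values) {
    if (weights.length > values.length) {
      return false;
    }
    // Align shapes on their trailing dimensions.
    int offset = values.length - weights.length;
    for (int i = 0; i < weights.length; i++) {
      if (weights[i] != 1 && weights[i] != values[offset + i]) {
        return false;
      }
    }
    return true;
  }

  public static void main(String[] args) {
    System.out.println(canBroadcast(new int[] {1, 4}, new int[] {3, 4})); // true
    System.out.println(canBroadcast(new int[] {2, 4}, new int[] {3, 4})); // false
  }
}
```

A scalar (rank-0) weight always broadcasts, which is why the commit distinguishes the nonscalar case in canBroadcastNonscalarShapes.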
1 parent 496191d commit 3a0489e

45 files changed: +4497 −18 lines

Diff for: tensorflow-framework/src/main/java/org/tensorflow/framework/losses/CategoricalCrossentropy.java (+4 −3)
@@ -69,7 +69,7 @@
 public class CategoricalCrossentropy extends Loss {
   public static final boolean FROM_LOGITS_DEFAULT = false;
   public static final float LABEL_SMOOTHING_DEFAULT = 0.0f;
-  public static final int DEFAULT_AXIS = -1;
+  public static final int DEFAULT_AXIS = Losses.CHANNELS_LAST;

   private final boolean fromLogits;
   private final float labelSmoothing;

@@ -203,8 +203,9 @@ public CategoricalCrossentropy(
    * confidence on label values are relaxed. e.g. <code>labelSmoothing=0.2</code> means that we will use a
    * value of <code>0.1</code> for label <code>0</code> and <code>0.9</code> for label <code>1</code>
    * @param reduction Type of Reduction to apply to loss.
-   * @param axis The channels axis. <code>axis=-1</code> corresponds to data format "Channels Last"
-   *     and <code>axis=1</code> corresponds to data format "Channels First".
+   * @param axis The channels axis. <code>axis=-1</code> corresponds to data format "Channels Last"
+   *     and <code>axis=1</code> corresponds to data format "Channels First",
+   *     i.e. {@link Losses#CHANNELS_LAST} and {@link Losses#CHANNELS_FIRST}
    * @throws IllegalArgumentException if labelSmoothing is not in the inclusive range of 0. - 1.
    */
   public CategoricalCrossentropy(
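The Javadoc above states that labelSmoothing=0.2 turns labels 0 and 1 into 0.1 and 0.9. That arithmetic can be sketched in plain Java, assuming the standard label-smoothing formula label * (1 - smoothing) + smoothing / numClasses (a hypothetical helper for illustration, not the tensorflow-framework smoothCategoricalLabels implementation):

```java
public class LabelSmoothing {

  /**
   * Smooths a one-hot label value toward the uniform distribution:
   * label * (1 - smoothing) + smoothing / numClasses.
   */
  public static float smooth(float label, float smoothing, int numClasses) {
    return label * (1f - smoothing) + smoothing / numClasses;
  }

  public static void main(String[] args) {
    // With labelSmoothing = 0.2 and two classes, label 0 becomes ~0.1
    // and label 1 becomes ~0.9, matching the Javadoc example.
    System.out.println(smooth(0f, 0.2f, 2));
    System.out.println(smooth(1f, 0.2f, 2));
  }
}
```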

Diff for: tensorflow-framework/src/main/java/org/tensorflow/framework/losses/Losses.java (+4 −1)
@@ -36,6 +36,9 @@ public class Losses {
   /** Default Fuzz factor. */
   public static final float EPSILON = 1e-7f;

+  public static final int CHANNELS_LAST = -1;
+  public static final int CHANNELS_FIRST = 1;
+
   /**
    * Calculates the mean absolute error between labels and predictions.
    *

@@ -239,7 +242,7 @@ public static <T extends TNumber, U extends TNumber> Operand<T> categoricalCross
     tLabels = smoothCategoricalLabels(tf, tLabels, labelSmoothing);
   }
   if (fromLogits) {
-    return tf.nn.softmaxCrossEntropyWithLogits(tLabels, predictions, -1);
+    return tf.nn.softmaxCrossEntropyWithLogits(tLabels, predictions, axis);
   }
   /* TODO
   if (!(predictions instanceof Variable) && (!tf.scope().env().isEager())) {
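The Losses.java change replaces the hard-coded -1 with the caller's axis, so CHANNELS_FIRST data is no longer silently reduced over the wrong dimension. The meaning of the two constants can be sketched by resolving a possibly-negative axis against a tensor rank (a hypothetical standalone helper, mirroring the constants above but not part of tensorflow-framework):

```java
public class ChannelsAxis {
  // Mirrors of the constants added to Losses.java above.
  public static final int CHANNELS_LAST = -1;
  public static final int CHANNELS_FIRST = 1;

  /** Resolves a possibly-negative axis to a concrete dimension index for a given rank. */
  public static int resolveAxis(int axis, int rank) {
    int resolved = axis < 0 ? axis + rank : axis;
    if (resolved < 0 || resolved >= rank) {
      throw new IllegalArgumentException("axis " + axis + " out of range for rank " + rank);
    }
    return resolved;
  }

  public static void main(String[] args) {
    // For a rank-4 NHWC tensor, CHANNELS_LAST (-1) resolves to dimension 3;
    // for NCHW, CHANNELS_FIRST (1) is already dimension 1.
    System.out.println(resolveAxis(CHANNELS_LAST, 4));  // 3
    System.out.println(resolveAxis(CHANNELS_FIRST, 4)); // 1
  }
}
```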
