Topic: General Level: All
Welcome to the world of cutting-edge technology! Every two weeks, we bring you the latest and most exciting advancements in the tech industry, sure to leave you feeling inspired and empowered.
Stay ahead of the game and be the first to know about the newest innovations shaping our world. Discover new ways to improve your daily life, become more efficient, and enjoy new experiences.
This time, we've got some exciting news to share with you!
The Java Streams API can operate on JSON documents in two ways (a sketch of the second approach follows below):
1. JsonObject - accessing the JSON elements by name and processing them in their string representation; the terminal operation collects with JsonCollectors and returns a JSON array output
2. Jackson data binding with JsonNode - reading the JSON as a string and using the JsonNode as a stream
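Below is a minimal sketch of the second approach, assuming Jackson databind is on the classpath; the sample JSON and field names are made up for illustration. JsonNode implements Iterable<JsonNode>, so its spliterator can back a Stream.
import com.fasterxml.jackson.databind.JsonNode;
import com.fasterxml.jackson.databind.ObjectMapper;
import java.util.List;
import java.util.stream.Collectors;
import java.util.stream.StreamSupport;

public class JsonNodeStreamExample {
    public static void main(String[] args) throws Exception {
        String json = "[{\"name\":\"alpha\",\"price\":10},{\"name\":\"beta\",\"price\":25}]";
        JsonNode root = new ObjectMapper().readTree(json);

        // JsonNode is Iterable over its children, so it can be turned into a Stream
        List<String> expensive = StreamSupport.stream(root.spliterator(), false)
                .filter(node -> node.get("price").asInt() > 15)
                .map(node -> node.get("name").asText())
                .collect(Collectors.toList());

        System.out.println(expensive); // [beta]
    }
}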
ChatGPT-generated microservices with step-by-step inputs:
Through context correlation it automatically integrated the controller with the API, and the generated POJOs are of particular significance!
JEP 431 in Java 21 introduces Sequenced Collections, which comprise:
1. Interface SequencedCollection - extends Collection and is in turn extended by List and Deque
interface SequencedCollection<E> extends Collection<E> {
    // new method
    SequencedCollection<E> reversed();
    // methods promoted from Deque
    void addFirst(E e);
    void addLast(E e);
    E getFirst();
    E getLast();
    E removeFirst();
    E removeLast();
}
2. Interface SequencedSet - extends Set and SequencedCollection; it is in turn extended by SortedSet and implemented by LinkedHashSet
interface SequencedSet<E> extends Set<E>, SequencedCollection<E> {
    SequencedSet<E> reversed(); // covariant override
}
3. Interface SequencedMap - extends Map; it is in turn extended by SortedMap and implemented by LinkedHashMap
interface SequencedMap<K, V> extends Map<K, V> {
    // new methods
    SequencedMap<K, V> reversed();
    SequencedSet<K> sequencedKeySet();
    SequencedCollection<V> sequencedValues();
    SequencedSet<Entry<K, V>> sequencedEntrySet();
    V putFirst(K k, V v);
    V putLast(K k, V v);
    // methods promoted from NavigableMap
    Entry<K, V> firstEntry();
    Entry<K, V> lastEntry();
    Entry<K, V> pollFirstEntry();
    Entry<K, V> pollLastEntry();
}
These interfaces essentially model a doubly linked list view of a collection, offering access to the first and last elements as well as reversed views; a short demo follows.
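A small Java 21 sketch of these methods in action; existing collections such as ArrayList and LinkedHashSet pick them up automatically:
import java.util.ArrayList;
import java.util.LinkedHashSet;
import java.util.List;
import java.util.SequencedSet;

public class SequencedCollectionsDemo {
    public static void main(String[] args) {
        List<String> list = new ArrayList<>(List.of("b", "c"));
        list.addFirst("a");                   // front access promoted to List
        list.addLast("d");
        System.out.println(list);             // [a, b, c, d]
        System.out.println(list.reversed());  // [d, c, b, a] - a view, not a copy

        SequencedSet<Integer> set = new LinkedHashSet<>(List.of(1, 2, 3));
        System.out.println(set.getFirst() + " .. " + set.getLast()); // 1 .. 3
        System.out.println(set.reversed());   // [3, 2, 1]
    }
}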
Reflection is best understood in the context of Java's runtime execution container, the JVM, which oversees the intricacies of memory handling via garbage collection and holds the runtime type information built by the class loader as a graph describing the inheritance hierarchy and interface implementations of every type.
The object header holds this metadata, categorized as:
1. type-specific - the Klass word, a pointer to the class metadata that resides in Metaspace
2. instance-specific - the Mark word, used for intrinsic locks and garbage collection
Exposing this runtime metadata and making it accessible to programs is what reflection is.
Reflection can be combined with class loading to provide the capability for Java programs to work with code that was completely unknown at compile time.
This approach is the basis of plugin architectures and other highly dynamic mechanisms that Java can make use of, as sketched below.
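A minimal sketch of the plugin idea: the class name arrives as data at runtime and is loaded and instantiated reflectively. Here it is hard-coded to a JDK class so the snippet runs standalone; a real plugin architecture would read the name from configuration or scan a plugins directory.
import java.util.List;

public class PluginLoaderDemo {
    public static void main(String[] args) throws Exception {
        String pluginClassName = "java.util.ArrayList"; // stand-in for a plugin class name
        Class<?> pluginClass = Class.forName(pluginClassName);

        // Instantiate reflectively and use it through a compile-time-known interface
        @SuppressWarnings("unchecked")
        List<String> plugin = (List<String>) pluginClass.getDeclaredConstructor().newInstance();
        plugin.add("loaded " + pluginClass.getName() + " without compile-time knowledge");
        System.out.println(plugin);
    }
}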
Reflection also allows variadic (variable-argument) parameters to be passed during instance creation or method invocation.
Key considerations when working with reflection:
Exception handling - NoSuchMethodException (the method is missing from the class), IllegalAccessException (access is controlled), InstantiationException (unable to create an object via the constructor), InvocationTargetException (the target method rethrew an exception)
Overloading - overloads are looked up via getMethod() with the method name and parameter types
Access control - getMethod() is used for looking up public methods. By contrast, getDeclaredMethod() can be used to find any method declared on a class, even private methods. It is possible to override those semantics because the API provides setAccessible().
Autoboxing and casting - lookup methods (such as getConstructor() and getMethod()) take Class<?> objects as parameters, so you can simply pass the class literals corresponding to primitive types, such as int.class
Since the Reflection API predates the Collections Framework, the java.lang.reflect package uses array representations rather than collections. The considerations above are pulled together in the sketch below.
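A minimal sketch, assuming a hypothetical Greeter class: it looks up a private method, overrides access control, relies on autoboxing for the int argument, and handles the reflective exception types.
import java.lang.reflect.InvocationTargetException;
import java.lang.reflect.Method;

public class ReflectionDemo {
    static class Greeter {
        private String greet(String name, int times) {
            return ("hello " + name + " ").repeat(times).trim();
        }
    }

    public static void main(String[] args) {
        try {
            // getDeclaredMethod finds private methods; class literals (int.class)
            // stand in for primitive parameter types
            Method m = Greeter.class.getDeclaredMethod("greet", String.class, int.class);
            m.setAccessible(true); // override the access-control semantics

            // invoke takes varargs; the int argument is autoboxed to Integer
            Object result = m.invoke(new Greeter(), "world", 2);
            System.out.println(result);
        } catch (NoSuchMethodException | IllegalAccessException e) {
            throw new IllegalStateException("reflective lookup failed", e);
        } catch (InvocationTargetException e) {
            // the target method itself threw; the original exception is the cause
            throw new IllegalStateException(e.getCause());
        }
    }
}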
Running concurrent processing on stream elements can be achieved via:
1. Collection.parallelStream()
2. BaseStream.parallel()
The variation arises in the way we partition the stream data for parallel processing.
In the Collection.parallelStream() variant, the collection's Spliterator can be custom-defined or fall back to the default trySplit() implementation, providing either a user-defined split mechanism or the existing library behavior.
The BaseStream.parallel() variant always relies on the stream's existing Spliterator and cannot be custom implemented; the JDK's default implementation of Collection.parallelStream() is shown below.
/**
 * @implSpec
 * The default implementation creates a parallel {@code Stream} from the
 * collection's {@code Spliterator}.
 *
 * @return a possibly parallel {@code Stream} over the elements in this
 * collection
 * @since 1.8
 */
default Stream<E> parallelStream() {
    return StreamSupport.stream(spliterator(), true);
}
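A brief sketch of the two entry points; both feed into the same fork/join machinery and differ only in where the Spliterator comes from:
import java.util.List;
import java.util.stream.IntStream;

public class ParallelStreamDemo {
    public static void main(String[] args) {
        List<Integer> numbers = List.of(1, 2, 3, 4, 5, 6, 7, 8);

        // 1. Collection.parallelStream(): builds on the collection's spliterator()
        int sum1 = numbers.parallelStream().mapToInt(Integer::intValue).sum();

        // 2. BaseStream.parallel(): flips an existing sequential stream to parallel
        int sum2 = IntStream.rangeClosed(1, 8).parallel().sum();

        System.out.println(sum1 + " " + sum2); // 36 36
    }
}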
With Spring Data JPA Specifications, multiple predicates can be incorporated into a single piped query to perform filtering such as like matches, comparisons, sorting, and other SQL operations.
This is attainable by extending the JpaSpecificationExecutor interface in the entity repository and providing implementations for the specifications, as sketched below.
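A hedged sketch of the approach; the Product entity, its fields, and the repository are made up for illustration, while the Specification and JpaSpecificationExecutor types come from Spring Data JPA (jakarta.persistence implies Spring Boot 3.x; older versions use javax.persistence):
import jakarta.persistence.Entity;
import jakarta.persistence.GeneratedValue;
import jakarta.persistence.Id;
import org.springframework.data.jpa.domain.Specification;
import org.springframework.data.jpa.repository.JpaRepository;
import org.springframework.data.jpa.repository.JpaSpecificationExecutor;

@Entity
class Product {
    @Id @GeneratedValue Long id;
    String name;
    double price;
}

interface ProductRepository extends JpaRepository<Product, Long>,
        JpaSpecificationExecutor<Product> {

    // like-match on the name column
    static Specification<Product> nameContains(String fragment) {
        return (root, query, cb) -> cb.like(root.get("name"), "%" + fragment + "%");
    }

    // comparison on the price column
    static Specification<Product> priceLessThan(double limit) {
        return (root, query, cb) -> cb.lt(root.get("price"), limit);
    }
}

// Usage: both predicates piped into a single query, with sorting applied on top
// List<Product> cheapWidgets = productRepository.findAll(
//         ProductRepository.nameContains("widget")
//                 .and(ProductRepository.priceLessThan(10.0)),
//         Sort.by("price"));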
Identifying memory leaks in an application can be a daunting task: the investigation spans thread hierarchies traversing the application code, third-party frameworks, executor pools, and proper lifecycle management of threads, and multiple facets need to be validated to narrow down the source of inception.
However, certain tools at your disposal alleviate the complication of the thread analysis:
1. jstack - thread dump snapshot
2. JDK Flight Recorder - jdk.JavaThreadStatistics records thread events and metrics on active threads;
jdk.ThreadStart and jdk.ThreadEnd provide information on thread initiation and teardown
3. JFR Analytics - analyze JFR recordings using standard SQL (leveraging Apache Calcite under the hood), allowing you to traverse hierarchically to the origination of a thread
For flame graph generation and thread stack analysis we rely on profilers; creating a custom lightweight profiler is possible as well, as the sketch below illustrates.
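A hedged sketch of a very small sampling profiler: it periodically captures all thread stack traces (the same data jstack exposes) so they can be aggregated later, for example into a flame graph. The interval and duration are arbitrary, and this is not production-grade:
import java.util.Map;
import java.util.concurrent.Executors;
import java.util.concurrent.ScheduledExecutorService;
import java.util.concurrent.TimeUnit;

public class MiniSampler {
    public static void main(String[] args) throws InterruptedException {
        ScheduledExecutorService sampler = Executors.newSingleThreadScheduledExecutor();
        sampler.scheduleAtFixedRate(() -> {
            // snapshot of every live thread's stack, like a programmatic jstack
            Map<Thread, StackTraceElement[]> stacks = Thread.getAllStackTraces();
            stacks.forEach((thread, frames) -> {
                System.out.println(thread.getName() + " [" + thread.getState() + "]");
                for (StackTraceElement frame : frames) {
                    System.out.println("    at " + frame);
                }
            });
        }, 0, 500, TimeUnit.MILLISECONDS); // sample every 500 ms

        Thread.sleep(2_000); // let it take a few samples of this demo process
        sampler.shutdownNow();
    }
}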
Disclaimer:
This is a personal blog. Any views or opinions represented in this blog are personal and belong solely to the blog owner and do not represent those of people, institutions or organizations that the owner may or may not be associated with in a professional or personal capacity, unless explicitly stated. Any views or opinions are not intended to malign any religion, ethnic group, club, organization, company, or individual. All content provided on this blog is for informational purposes only. The owner of this blog makes no representations as to the accuracy or completeness of any information on this site or found by following any link on this site. The owner will not be liable for any errors or omissions in this information nor for the availability of this information. The owner will not be liable for any losses, injuries, or damages from the display or use of this information.
Downloadable Files and Images
Any downloadable file, including but not limited to pdfs, docs, jpegs, pngs, is provided at the user’s own risk. The owner will not be liable for any losses, injuries, or damages resulting from a corrupted or damaged file.
Comments are welcome. However, the blog owner reserves the right to edit or delete any comments submitted to this blog without notice due to:
- Comments deemed to be spam or questionable spam.
- Comments including profanity.
- Comments containing language or concepts that could be deemed offensive.
- Comments containing hate speech, credible threats, or direct attacks on an individual or group.